  • [MUSIC PLAYING]

  • DAN AHARON: Hi, everyone.

  • I'm pleased to be here with you.

  • I hope you're having an awesome Google Cloud Next.

  • I'm here to talk to you about something that I'm personally

  • very excited about, and we at Google

  • are very excited about, which is conversational applications,

  • which includes Bots, and I hope you're

  • as excited as me at the beginning of the session,

  • or if not, by the end of it.

  • Here we're going to introduce API.AI

  • to you guys, which is a tool probably many of you

  • are familiar with.

  • It's our Bot application development platform

  • that we acquired last year.

  • And we're also going to go beyond the basics

  • and show you guys a little bit more tips and tricks.

  • So you're going to hear from a couple people, Cornelius

  • and Petr, that have been using API.AI,

  • and they're going to share some of the lessons

  • that they've learned while building conversational Bots.

  • So, this is one of three different sessions

  • we have on this topic of conversational applications.

  • We had one yesterday which focused

  • on Cloud Functions and serverless architecture

  • to serve conversational applications.

  • We're now doing the middle session

  • that you see on the slide.

  • And then, following after me, Brad and Guillaume

  • are going to show you how to extend Google assistant

  • with actions on Google.

  • So, what is a conversational agent platform?

  • We'll just go through this quickly, probably a lot of you

  • already know, but, basically what it does

  • is it takes natural language and it turns it

  • into structured data that you can then

  • use in your applications.

  • So, for example, if a user orders

  • a pizza and says what ingredients it has,

  • the software will turn that into structured data which

  • says what's exactly inside the pizza, what type it is,

  • and then you can actually act on it.
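
To make that concrete, here is a hedged sketch of the kind of structured data such a platform hands back for a pizza order. The field names loosely follow the legacy API.AI V1 query response as best I recall, and the intent, action, and parameter names are purely illustrative.

```javascript
// Illustrative only: roughly what "I'd like a large pizza with mushrooms
// and olives" might come back as, once an agent is trained for that domain.
const exampleResult = {
  action: 'order.pizza',                      // hypothetical action name
  parameters: {
    size: 'large',
    toppings: ['mushrooms', 'olives']
  },
  metadata: { intentName: 'order-pizza' }     // hypothetical intent name
};
// Your application code can now branch on the action and read the
// parameters instead of parsing free-form text.
```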

  • What it doesn't do is, it doesn't really

  • understand your domain and your vertical

  • until you actually teach it and train it.

  • You can give training examples that

  • explain exactly what to do.

  • And, it also doesn't do any fulfillment.

  • It's not actually going to bake a pizza

  • and deliver it for you, unfortunately.

  • That would be nice.

  • But, the good thing is you're here at Google Cloud Next,

  • you've probably been to a bunch of sessions.

  • Most of this conference is about how

  • to fulfill the back-end of your application,

  • and we think Google Cloud Platform is a great way

  • to do that.

  • So, it's a very good complement to API.AI

  • And then, specifically, we're very proud of API.AI,

  • we think it's pretty distinctive in the space.

  • A few of the important benefits that API.AI

  • has are that it is an end-to-end suite;

  • it really combines a bunch of different elements

  • that you need for building conversational applications.

  • It doesn't just do natural language understanding, or just

  • one component.

  • It can get pretty smart very quickly.

  • You're going to see that later on.

  • I'm going to attempt to do something that's probably very

  • risky, and not very wise of me.

  • I'm going to try and build in front of you,

  • from scratch, a whole new conversational application,

  • in less than 20 minutes, that handles an imaginary scenario.

  • Hopefully, it will work, we'll see.

  • But, you can see the power of API.AI

  • with a very small data set, what it's able to do.

  • It's multi-lingual, it already supports 40 languages,

  • and we're investing in growing that.

  • We want it to be as global as Google is.

  • There was a lot of speculation last year

  • when we bought API.AI that we will turn it

  • into sort of a closed system that only

  • works with Google software.

  • The truth is, it's far from it.

  • API.AI works with all of the different messaging platforms

  • that are out there.

  • And we're very proud of that.

  • We're a very open company at Google,

  • we actually want you to build the best

  • conversational applications that work across all the platforms

  • that you're interested in.

  • And we want to keep making API.AI the best

  • cross-platform tool.

  • So we're going to continue to support

  • all of the platforms that are out there.

  • I'm just going to go through this very quickly,

  • but this is just an example architecture

  • of how you can build a conversational application.

  • So, on the left-hand side, you can

  • see all of the different channels

  • where conversations can come in.

  • So, it starts from owned and operated websites

  • or web apps or mobile apps.

  • And you could have Google Assistant properties

  • like Pixel, Allo, and Home, or you could have integrations

  • with the different applications that I just showed you,

  • all of the different messaging applications.

  • The other thing is you could have voice calls come in

  • and be transcribed with Cloud Speech API, and flow in.

  • All of those can get processed by API.AI,

  • which handles the conversational aspect, including context,

  • natural language understanding, and figures

  • out how to involve the back-end.

  • And then, through web hook integration,

  • you can connect it to Google Cloud Platform

  • to handle all of your back-end needs, or any other platform

  • that you want.

  • You can connect to other external systems like ERP,

  • like CRM, or whatever you need on the back-end.

  • You can get there through Cloud Functions or any other Google

  • Cloud Platform application or cloud service.
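
As a concrete example of a channel feeding text into API.AI, here is a minimal hedged sketch using the legacy `apiai` Node.js client as I remember it from that era. The client access token and session ID are placeholders, and the event-emitter style (`textRequest`, `response`, `end`) should be checked against the current SDK documentation.

```javascript
const apiai = require('apiai');

// Placeholder token: the agent's client access token from the API.AI console.
const agent = apiai('YOUR_CLIENT_ACCESS_TOKEN');

function sendToAgent(text, sessionId) {
  // sessionId ties follow-up messages (and contexts) to the same conversation.
  const request = agent.textRequest(text, { sessionId });

  request.on('response', (response) => {
    // response.result carries the intent, parameters, and fulfillment text.
    console.log(response.result.metadata.intentName, response.result.parameters);
  });

  request.on('error', (error) => console.error(error));
  request.end();
}

sendToAgent("I'd like to fix my Pixel", 'demo-session-1');
```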

  • OK, so now to the demo.

  • So what I'm going to do now, we're

  • going to look at an imaginary Google hardware store.

  • And we're going to try and create

  • a conversational app that processes requests

  • for that imaginary store, including service

  • requests and commerce requests.

  • So we're going to just quickly go through defining an entity,

  • defining the intent, adding an ability to buy other items

  • through a WebHook, and then we're

  • going to connect it to a messaging platform.

  • OK, so let's switch to the demo now.

  • OK, can everyone see my screen?

  • How's the font size?

  • Is it OK?

  • Yes.

  • People are saying yes.

  • OK, great.

  • OK, so let's start by defining an entity.

  • And we'll list all of the products

  • that we want to sell in this store.

  • So let's say we want to sell a Pixel.

  • And sometimes that is called Google Pixel.

  • And it could also be in plural, so let's say

  • Pixels and Google Pixels.

  • And let's say you also want to sell a home, which could also

  • be Google Home.

  • And let's add the plural.

  • And let's add a Chromecast to that.

  • Chromecasts.

  • And what else?

  • Let's add a Chromebook.

  • OK, and that's enough.

  • Let's save our entity.

  • And now let's define our first intent.

  • So let's make the first intent about service.

  • And let's just add examples of how

  • users would ask for service.

  • So maybe one guy will say, I'd like to fix my Pixel.

  • And you can see that API.AI automatically

  • identified that Pixel is one of the products

  • that I identified earlier and it labeled it as a parameter

  • name called Products.

  • I'm just going to change it to Product, because you just

  • want one product.

  • I'm going to save this.

  • And let's add a few more examples.

  • Let's say, can I please buy a Chromecast?

  • I'd like to get a Chrome.

  • Can I please-- oh, sorry.

  • I'm mixing commerce and service.

  • Let me redo this.

  • I'd like to fix my Pixel.

  • Can I please fix my Chromecast?

  • My Chromebook is not working.

  • And let's start with that.

  • And let's test it now.

  • So I'm going to take this example,

  • but use a different product.

  • So instead of I'd like to fix my Pixel,

  • let's try I'd like to fix my Chromebook.

  • Oh, OK, so that's great.

  • So what you see is it identified an intent that is service.

  • And it identified a parameter name Product

  • that is Chromebook, which is exactly what we wanted.

  • Now let's add a response.

  • Let's say, sure, no problem.

  • Dispatching service for product right away.

  • Save.

  • And let's test it.

  • Can I please fix my Chromecast?

  • And you can see, it says, sure, no problem.

  • Dispatching service for Chromecast right away.

  • And if I click on Show JSON, you can

  • see there's this very rich JSON that you

  • can send to your back end, to your application that

  • can actually process everything that you need to do to actually

  • dispatch someone out there.

  • It's all structured information that you can act on right away.
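
For reference, the "Show JSON" payload looks roughly like the following. This is a hedged reconstruction of the legacy V1 response format from memory, with illustrative values taken from the demo; the exact fields may differ.

```javascript
// Hedged reconstruction of an API.AI V1 query response (values illustrative).
const showJsonExample = {
  result: {
    resolvedQuery: 'Can I please fix my Chromecast?',
    action: '',                                  // empty unless the intent defines one
    score: 0.95,                                 // confidence score (mentioned again in the Q&A)
    parameters: { Product: 'Chromecast' },
    contexts: [],
    metadata: { intentName: 'Service' },
    fulfillment: {
      speech: 'Sure, no problem. Dispatching service for Chromecast right away.'
    }
  },
  status: { code: 200, errorType: 'success' }
};
```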

  • So this is great.

  • And we have something that is doing service right now.

  • Let's also add a Commerce intent.

  • And now let's say, I'd like to buy three Chromecasts, please.

  • OK, so it identifies quantity and product.

  • So let's name this one as Quantity.

  • And instead of Products, let's just call it Product.

  • And let's show a few more examples.

  • So can I please get five Pixels?

  • OK, so this one, it didn't auto recognize,

  • but we can easily fix that.

  • I'm going to remove this.

  • And instead, I'm going to mark the 5 as Quantity

  • and pixels as Product.

  • And let's give it a couple more examples.

  • Can I buy seven Chromebooks?

  • Please get me two Pixels.

  • OK, so it looks like it's working.

  • Let's also add a shipping address.

  • So let's say, can you please send me four Google Homes

  • to 5 Fifth Street, New York, New York?

  • OK, so it recognized that we have an address here.

  • It marked it.

  • Let's just show it the full thing.

  • And then let's just mark quantity.

  • And I'm going to save this.

  • And let's add a response.

  • Adding quantity, product, and sending to address.

  • I think this one we don't need.

  • Let's delete it.

  • And let's save.

  • And let's test it out.

  • Let's say, can I buy five Pixels?

  • OK, let me try, I'd like to buy five Pixels, please.

  • OK, you can see the Intent, it recognizes commerce.

  • It got that it's a Pixel.

  • It got that it's five.

  • And the address is missing, obviously,

  • because I didn't give an address.

  • And so it can't really respond, because it

  • doesn't have an address.

  • But everything is working well.

  • Now, the previous sentence I gave

  • should have been recognized, but it wasn't recognized.

  • So this is where API.AI really shines.

  • You can actually go in real time and apply new training

  • based on the data, the experiments you just made.

  • So this is what we tried earlier.

  • Service for Chromebook.

  • That was correct.

  • So I can mark it as correct.

  • And please fix my Chromecast.

  • That also worked well earlier.

  • And this one did not work.

  • Can I buy five Pixels?

  • Oh, there was a spelling mistake here.

  • Maybe that's why it didn't work.

  • But I can go back and actually assign this training data

  • to what should have happened.

  • And I can say, this should have been a quantity of 5,

  • not a unit-length quantity.

  • And Pixel should have been a product.

  • I can mark it with a checkmark.

  • And then from now on, it's going to add that to the examples

  • it knows how to recognize.

  • And then this one should be Commerce.

  • Quantity.

  • But I want to track this.

  • And Pixels is the product.

  • And checked.

  • I'm going to approve it.

  • And now let's try another one.

  • Let's try, can I buy nine Chromecasts?

  • Great.

  • Recognizes commerce.

  • Recognizes Chromecast.

  • Recognizes nine.

  • And now we see that it's the same problem again,

  • that it doesn't have an address, so it can't answer.

  • So this is something really cool about API.AI.

  • You can very easily turn this into a conversation

  • where if I mark these three parameters as required,

  • I can now talk to the user and ask

  • it to fill the required fields.

  • So if it's missing a quantity, I can say,

  • how many would you like to buy?

  • And then for product, I can ask, which product are you

  • interested in?

  • Notice that there's room for multiple prompts.

  • This is because we found that sometimes you

  • want to ask the same question with multiple variations just

  • to make it more interesting for users.

  • So let's also ask, where would you like me to ship it to?

  • And let's save this.

  • And then let's try this again.

  • Can I buy seven Chromecasts?

  • OK, and so now you can see it's responding.

  • It says, where would you like me to ship it to?

  • And now I can tell it, 1 First Street, San Francisco,

  • California.

  • And then it responds, adding seven Chromecasts

  • and sending it to 1 First Street,

  • San Francisco, California.

  • OK, this is all good and pretty awesome.

  • Let's try one last one.

  • And then make this a little bit more complex.

  • Let's try one that's missing quantity.

  • So I'd like to buy a few Pixels.

  • Now, I never actually trained it on a sentence

  • that looks like that.

  • If you look at the other ones, there's

  • no place where I say a few.

  • But it already understands that I want to do commerce.

  • Actually, it didn't get the product this time.

  • We'll fix that in a second.

  • But I guess it's because of the few.

  • How many would you like to buy?

  • Let's give it eight Pixels. And where would you

  • like me to ship it to?

  • 7 Seventh Street, San Francisco, California.

  • And now you can see it finishes the full thing.

  • So we can go back to training and just tell it

  • that if you have something with 'a few',

  • that is still a legitimate ask.

  • OK, so now let's make it a little bit more interesting.

  • Now, let's say that you want to be

  • able to feed this into your back end application

  • into some sort of a shopping cart

  • and be able to add sort of multiple items

  • from the same kind.

  • So what I want to do here is to have some sort of counter,

  • where I send this counter to my back end, my back end

  • increments it by one, and then sends it back to API.AI

  • with an answer.

  • So the way I'm going to do this is I'm

  • going to create what we call a context.

  • So I'm going to create a context called Shopping Cart.

  • And now every time someone does a commerce operation,

  • it's going to save it in the shopping cart.

  • And I'm going to create another intent that requires

  • Shopping Cart as an input.

  • So this will now only happen if someone has a shopping cart.

  • And let's say someone says, please add one more

  • or I'd like to buy another one or another one, please.

  • Let's call this Buyanother.

  • Buyanother.

  • And now let's add from Shopping Cart, the product.

  • And let's add the quantity and let's add the address.

  • So number and address and shopping cart quantity

  • and shopping cart address.

  • OK, let's save.

  • Now, what I'm going to do is I'm going

  • to connect it to my WebHook.

  • And if it doesn't work, I'm just going

  • to put error, so that we know that it didn't actually

  • go to the WebHook.

  • And I just want to show you guys what that looks like.

  • So you can see, this is a WebHook

  • I created in Cloud Functions earlier.

  • That basically increments a counter in a context

  • that comes from API.AI.

  • I can show you guys the actual code.

  • It's basically four or five lines of code.

  • That's all.

  • It's Node.js, by the way, if someone's interested.
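
Dan's function itself isn't shown in the transcript, so here is a hedged sketch of what a Cloud Functions webhook along those lines might look like in Node.js. It assumes the legacy V1 webhook format (request `result.contexts`, response `speech`/`displayText`/`contextOut`), and the context and parameter names (`shoppingcart`, `quantity`, `product`) are guesses based on the demo.

```javascript
// Hedged sketch of a "buy another" webhook for Cloud Functions (HTTP trigger).
exports.buyAnother = (req, res) => {
  const result = req.body.result || {};
  const contexts = result.contexts || [];

  // Context name is an assumption; API.AI lower-cases context names.
  const cart = contexts.find((c) => c.name === 'shoppingcart');
  if (!cart) {
    res.json({ speech: 'error', displayText: 'error' });
    return;
  }

  const quantity = Number(cart.parameters.quantity) + 1;   // increment by one
  const product = cart.parameters.product;
  const reply = `Adding one more ${product}. You now have ${quantity} in your shopping cart.`;

  res.json({
    speech: reply,
    displayText: reply,
    // Write the updated count back so the next turn sees it.
    contextOut: [{
      name: 'shoppingcart',
      lifespan: 5,
      parameters: Object.assign({}, cart.parameters, { quantity })
    }]
  });
};
```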

  • So let's try it out.

  • Let's see if it works.

  • So let's start by, I'd like to buy a Chromebook, please.

  • Let's make it two Chromebooks, please.

  • Where would you like me to ship it to?

  • Let's say, 7 Seventh Street, San Francisco.

  • So it's great.

  • It responds that it's adding it to my shopping cart.

  • I get a context of a shopping cart.

  • And now let's ask it for another one.

  • And so you can see the response.

  • Adding one more Chromebook, you now

  • have three in your shopping cart.

  • And it remembers, from the context of the conversation,

  • that you have a Chromebook.

  • It sent it to our back end, the back end incremented

  • by one, sent it back, and now you're

  • seeing the response at three.

  • If I try and trigger this again, please add one more,

  • you'll see it says you now have four Chromebooks

  • in your shopping cart.

  • So what you see here is I didn't measure time,

  • but I think it's something like 10,

  • 15 minutes, we basically built this whole conversational

  • application.

  • It does commerce.

  • It does service.

  • It'll get better the more examples we add to it,

  • the more it gets usage data, and the more that we train it.

  • But it can already handle a lot of queries pretty easily.

  • Now, what I would like to do is I'm going to connect it here

  • to Facebook Messenger.

  • And you can see we have one-click integrations.

  • Like, very easy to start and stop.

  • Let's just start it.

  • And I'm going to go to Facebook Messenger.

  • And now let's say, I'd like to buy a Pixel, please.

  • Obviously, it's not working.

  • OK, perfect.

  • How many would you like to buy?

  • [APPLAUSE]

  • Go ahead and tell it six.

  • Thank you.

  • Let's give it an address.

  • 6 6th Street, San Francisco, California.

  • And now let's try the last thing,

  • another one where it goes directly to our WebHook.

  • OK, now you can see it says seven.

  • So that's it, guys.

  • [APPLAUSE]

  • Thank you very much.

  • We got through this together.

  • You can see just how easy it is to create a conversational

  • application that's really powerful.

  • And with ease, you can connect it automatically

  • to all of those cross-platform applications you saw earlier.

  • You can put it on your own website,

  • in your own mobile app, or work through it

  • with the different messengers.

  • Very, very exciting.

  • So there'll be Q&A later.

  • But now I'd like to invite to the stage Cornelius, who

  • is going to show us a little bit more

  • about what they've done at Bosch.

  • CORNELIUS MUNZ: Thanks.

  • [APPLAUSE]

  • So thanks, Dan.

  • My name is Cornelius.

  • I'm working at Bosch Thermotechnology.

  • We are Europe's leading OEM for heating appliances, boilers,

  • appliances that prepare hot water for you.

  • So can we switch to the slide deck?

  • Slide deck.

  • OK, I'll go on.

  • OK, thanks.

  • OK, and our devices have become smarter

  • and smarter over the last years.

  • So we now have out-of-the-box IP connectivity

  • built into our devices.

  • So if you buy a Bosch appliance, a Bosch [INAUDIBLE],

  • you will get out-of-the-box API support.

  • You can connect your app to it and remote control it.

  • So that's fine.

  • But we have a problem.

  • We are losing contact with the customer more and more.

  • So how do we get the interface back to the customer

  • if the device becomes smarter and smarter?

  • And our approach was that we tried

  • to figure out how our conversational bots can

  • be used to keep in contact with our devices.

  • And how does it work?

  • So we have the answer from a software architecture

  • perspective, mainly four layers.

  • On the left side, you can see the interaction

  • of the user integrated in Skype, Slack, Facebook Messenger,

  • whatever, your own home page.

  • And the user can send the message to the API.AI chatbot.

  • That chatbot processes the intent, as shown by Dan before,

  • and the intent detection then fires an HTTP call

  • to the WebHook.

  • And the WebHook is now a piece of software we have developed,

  • which maps the API.AI intent to a domain-specific API call,

  • which will then call our boiler.

  • And the boiler can, in that case,

  • set the room temperature setpoint to a new value,

  • and respond with an HTTP OK.

  • And then we can prepare a fulfillment message

  • in our WebHook to respond to the message of the user.

  • API.AI then routes that message back

  • to the originating messaging application.

  • And then it's done.

  • The boiler has set a new temperature.

  • And hopefully, the user [INAUDIBLE].

  • So let's have a look on the demo.

  • So back to the demo.

  • So here I have a demo set up on the left side.

  • You can see our internet portal, which is already available.

  • So you can control your heating appliance.

  • You can set some values, for example, the current room

  • temperature setpoint, the hot water setpoint

  • for your shower and bath.

  • And you can, for example, prepare a holiday mode.

  • If you are leaving the home, the device goes into standby.

  • And you can save a lot of money if the heating

  • device is switched off.

  • On the right side, I have connected here my iPhone.

  • And Skype is opened.

  • And you see, like in the demo before with the Facebook

  • Messenger, you have a bot as a chat contact.

  • And then you can say, OK, for example I will be on holiday.

  • And I have used Siri to transfer speech to a chat message.

  • And then a required parameter is missing.

  • To call the API of the boiler, I need the date period.

  • And the bot asks me for that.

  • Next week.

  • I will send this next week.

  • And the magic in API.AI transforms 'next week'

  • into a date period, which can be processed by our API.

  • And next week, hopefully, yeah,

  • it's transformed into a date.

  • And if our portal works fine, once we refresh it,

  • Holiday mode is switched on with the desired date period.

  • And we are done.

  • So use conversational--

  • [APPLAUSE]

  • --UX-- thanks-- to keep in contact with your device.

  • And that's a good possibility for us.

  • So let's dig a little bit more into development issues.

  • How have we done this?

  • It's the same setup we have seen by [INAUDIBLE] before.

  • We have different use cases, so we

  • can activate the holiday mode.

  • You can set or get the auto temperature.

  • You can ask the boiler, if you want to take a shower,

  • if the hot water is prepared, and so on.

  • And you can set your schedule of the boiler with buttons.

  • And yeah, that's fine.

  • But let's stick to the activate holiday mode.

  • I have prepared some [INAUDIBLE].

  • And yeah, then I show this already.

  • And I want to require a parameter.

  • It's the date period.

  • And then I have switched on the WebHook for the [INAUDIBLE]

  • presentation.

  • And I want to use it here.

  • And now I take the JSON here, which

  • is processed, and copy it out.

  • And then keep this in mind.

  • And now I switch to our back end implementation.

  • Oh, my VM is switched off.

  • So let's be patient till it's up and running.

  • So we are developing our back end in .NET.

  • And we're using Visual Studio for that.

  • And it was really easy for us to use API.AI.

  • Oh, sorry for that.

  • Yeah, let's keep on talking.

  • Corporate networks are not a good thing for developers.

  • OK, how to integrate within a .NET solution.

  • I have a Web API project up and running.

  • And the first thing I have to do is I use the NuGet Package

  • Manager.

  • And I add the NuGet package from API.AI.

  • So they provide many integration SDKs for node, for JavaScript,

  • for everything.

  • So that's the first thing.

  • I added that NuGet package.

  • Then I prepare my controller.

  • Here you have the route prefix of that controller,

  • which receives the messages.

  • The POST message here, it's the POST annotation.

  • And you get a JSON body out of that POST message.

  • And here, it's the model from the API.AI SDK, which is

  • provided by the NuGet package.

  • And that model is then in a form,

  • which you can process easily.

  • And here I have added some if/else switches.

  • You can also use a switch statement.

  • Doesn't matter.

  • And how to test it now locally on your machine

  • before deploying it in the cloud and connecting it directly

  • to the chatbot.

  • So I use here now the debug mode locally on my laptop.

  • And use Postman or you can use any other REST client.

  • Paste in the JSON you previously copied out

  • of the API.AI console.

  • And then fire that POST to the local host.

  • You can see it here.

  • It's now my local WebHook.

  • And [INAUDIBLE] back to Visual Studio.

  • Hopefully, yeah, it stops on the line where the activate holiday

  • mode intent is processed.

  • And the action name was called activate holiday mode.

  • And now I have to read out the parameter, so the date period.

  • And you have your [INAUDIBLE] structure.

  • It's well-formed.

  • And you have the parameters.

  • And you see, OK, yeah it's very small.

  • But you have one parameter.

  • It's named date period.

  • And it has that value.

  • And you can fetch out that key value

  • pair out of that structure.

  • And then I call--

  • I have a API Manager class, which

  • separates all our domain-specific API away

  • from the controller.

  • Then I call the set holiday mode,

  • activate it, and activate the date period.

  • So that's the domain-specific API call.

  • And then I prepare the fulfillment.

  • And the fulfillment is the response

  • of the chatbot to the user.

  • And here I do it mostly like shown before in the console.

  • I replace a placeholder key with the date period

  • the user provided.

  • And then I return OK.

  • So HTTP 200.

  • And if I go back to the Postman, then you

  • can see here the fulfillment, which is also JSON,

  • and which is sent back later on if you deploy it to the cloud

  • through the API.AI chatbot.
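
Cornelius's real webhook is a .NET Web API controller using the API.AI NuGet SDK, so the following is only an illustrative sketch of the same flow in Node.js (the language used elsewhere in this session). The action name, the `date-period` parameter, the boiler call, and the response fields all follow his description but are assumptions, and `boilerApi.setHolidayMode` is a made-up stand-in for their domain-specific API manager.

```javascript
const express = require('express');
const app = express();
app.use(express.json());

// Hypothetical stand-in for the domain-specific API manager class.
const boilerApi = {
  async setHolidayMode(active, datePeriod) {
    // The real implementation calls the boiler's REST API here.
    console.log(`holiday mode ${active ? 'on' : 'off'} for ${datePeriod}`);
  }
};

app.post('/webhook', async (req, res) => {
  const result = req.body.result || {};

  if (result.action === 'activate.holiday.mode') {        // action name assumed
    const datePeriod = result.parameters['date-period'];   // parameter name assumed

    await boilerApi.setHolidayMode(true, datePeriod);

    // The fulfillment that API.AI routes back to the user.
    const reply = `OK, holiday mode is activated for ${datePeriod}.`;
    res.json({ speech: reply, displayText: reply });
    return;
  }

  res.sendStatus(200);   // unknown action: acknowledge and let API.AI answer
});

app.listen(8080);
```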

  • And that's, overall, the thing we have prepared.

  • We try to launch this in the next two months

  • with an MVP approach.

  • So test it with a few users.

  • We will record what the users say to the bot

  • to learn how we should improve our utterances, which

  • fire the intents.

  • And then we are, yeah, excited to see

  • the response of the users.

  • And we are not starting by deploying it to Facebook and Skype.

  • We are starting by adding that chatbot to our internet portal,

  • so we have an HTTP interface to API.AI.

  • And then you can use it, also, in your web portal.

  • So API.AI was really seamless.

  • It took two or three days to get it up

  • and running with our real boilers.

  • I have to prepare that.

  • I switched off the holiday mode later on.

  • Because my wife says, if you come back and it's cold,

  • you're fired.

  • So we have out of the box language support.

  • And it is really good language support,

  • even for European languages.

  • So we can use German as a language, and Italian, French,

  • out of the box with API.AI.

  • Other solutions do not fulfill that requirement.

  • And here we do not have to translate

  • the speech with Google Translate before sending it to API.AI.

  • We can directly use German or Italian or French

  • as a language.

  • So thanks.

  • I would like to welcome Petr from ING to show the next demo.

  • [APPLAUSE]

  • PETR NETUSIL: Thanks, Cornelius.

  • All right.

  • Excellent.

  • So my name is Petr Netusil.

  • I work in the applied machine learning department at ING.

  • Let me figure out how this works.

  • Works good.

  • So there we get to experiment with all sorts of fun

  • new stuff, like AI, machine learning, and NLP.

  • And I'm excited to show you today

  • what we have been building with API.AI.

  • And hopefully, give you some tips

  • on how you can get started yourself

  • and something on top of what has been shown here.

  • But well, first, ING, big global bank.

  • But actually, my boss has recently

  • realized we're not a bank anymore.

  • We are an IT company in a financial industry.

  • And as an engineer, I find it excellent.

  • I find that transition great.

  • And the good thing is they didn't just say it like that,

  • but they invested almost a billion dollars,

  • if I calculate roughly correctly, into digital transformation

  • to make that happen and to make sure

  • that we are on the same platforms as our clients.

  • But anyway, let's go to the deep, gory details

  • of the solution.

  • So this is our architecture.

  • And maybe good to know, this is also our bot.

  • It's Marie, our digital assistant.

  • So she can help you with everything about the card.

  • So that is the kind of domain we have selected for a pilot.

  • And what you see perhaps immediately

  • is that the architecture's different.

  • So normally, the traditional architectures

  • would talk to API.AI directly and then do some WebHook

  • fulfillments out of that.

  • But we said, no, no, no, we want control, we want security,

  • we want logging, we want to do it better.

  • So we actually place our conversation logic

  • in the middle.

  • So I'll fly you through how this works.

  • So you start from any channel, Facebook Messenger, Skype,

  • or some other channel.

  • And then you go into the conversation logic

  • that will basically route between all these different APIs.

  • So if it's just API.AI discussion, you talk to API.AI.

  • So everything, again, happens via some REST calls, HTTPS

  • [INAUDIBLE] requests, like you have seen before,

  • or you can route to the sentiment analysis, which is something

  • I'm going to show you

  • and which we are very excited about too.

  • So that's based on the Google Natural Language API.

  • We also built a sentiment analysis and intervention

  • dashboard, which is based on WebSocket.

  • So you have a real time kind of connection between the server

  • and the client.

  • We have also Twilio for authentication.

  • And lastly, and for us at least more importantly,

  • we hooked it up all to our ING APIs

  • and we will get live data from this moving forward.

  • So if my corporate laptop will work

  • and we can very kindly switch to it.

  • [LAPTOP BELL]

  • We have even the sound.

  • Very good.

  • [INAUDIBLE] number.

  • Oh, we're perfect.

  • So Marie, our bot, well, let's just start with her.

  • So what she will tell you immediately is that

  • you can get started with questions and requests

  • about ING cards.

  • Now, typically what we see in Belgium a lot is

  • that a lot of people are just calling us, like, hey,

  • I don't know where is my card?

  • And we get lots of these calls.

  • So they would ask, hey, I ordered it last week.

  • And now it's not there anymore.

  • And did you send it?

  • Did you not send it?

  • So one intent we have programmed

  • is, where is my card?

  • And Marie, what she'll try to do now is to authenticate me.

  • Because we don't want just everyone to know these details.

  • So our demo profile, Robert, was born on the 1st of May 1967.

  • Now, what happens is we're going to match the user uniquely

  • based on the first name, surname, and the birth date

  • and get a one-time verification token.

  • Whew, yes, we got it.

  • The demo gods are with us today.

  • And I can put something grand on [INAUDIBLE] it won't work.

  • But let's be nice to Marie and put the right one.

  • So it's 129335.

  • Right.

  • So what happens now, we get authenticated

  • for the next 10 minutes.

  • So you don't need to push these tokens back and forth

  • all the time.
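
The architecture slide lists Twilio for authentication, so here is a minimal hedged sketch of how the conversation logic might issue and check a one-time code. `messages.create` is real Twilio API surface; the environment variables, sender number, and in-memory session store are assumptions for illustration.

```javascript
const twilio = require('twilio');
const client = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);

const sessions = new Map();        // sessionId -> { code, authenticatedUntil }

async function sendVerificationCode(sessionId, phoneNumber) {
  const code = String(Math.floor(100000 + Math.random() * 900000));   // e.g. 129335
  sessions.set(sessionId, { code, authenticatedUntil: 0 });

  await client.messages.create({
    to: phoneNumber,
    from: '+15550000000',          // placeholder sender number
    body: `Your verification code is ${code}`
  });
}

function verifyCode(sessionId, code) {
  const session = sessions.get(sessionId);
  if (!session || session.code !== code) return false;
  // Keep the session authenticated for the next 10 minutes, as in the demo.
  session.authenticatedUntil = Date.now() + 10 * 60 * 1000;
  return true;
}

function isAuthenticated(sessionId) {
  const session = sessions.get(sessionId);
  return Boolean(session) && Date.now() < session.authenticatedUntil;
}
```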

  • And we get the right delivery address details.

  • So this is one of the use cases I wanted to show you.

  • But the thing is we are extremely

  • excited about bots ourselves, but maybe not everyone is.

  • They are still young and they're still learning.

  • And what will theoretically happen, or more practically,

  • is that someone dislikes it.

  • So saying, I know this bot is terrible.

  • My god!

  • [HORN BLOWING FAILURE]

  • What is that?

  • Oh, so it's our sentiment analysis dashboard

  • that's triggering that.

  • Thank you.

  • [APPLAUSE]

  • If you have ideas for a better sound,

  • please talk to me after the session.

  • Right.

  • So this is basically the WebSocket

  • I was talking to you about.

  • So there is this server to client integration,

  • which triggered that.
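
That server-to-client integration is the WebSocket feed to the dashboard. Here is a minimal hedged sketch using the `ws` npm package; the port, the event shape, and the negative-score threshold that triggers the alert sound are all assumptions.

```javascript
const WebSocket = require('ws');

// Dashboard browsers connect to this WebSocket server (port is arbitrary here).
const wss = new WebSocket.Server({ port: 8081 });

// Called by the conversation logic after each user message has been scored.
function pushSentiment(sessionId, text, score) {
  const event = {
    sessionId,
    text,
    score,
    alert: score < -0.6          // intervention threshold (assumed)
  };
  const payload = JSON.stringify(event);

  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(payload);      // real-time push; the dashboard plays the alert sound
    }
  });
}

module.exports = { pushSentiment };
```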

  • So let me deal with it.

  • So here you see transcripts or, well, a dashboard

  • of all the different sessions today, or right now just the one.

  • And you see the different sentiment, which is happening.

  • But what is interesting here is that I can actually

  • go inside that full chat transcript.

  • I see everything here.

  • And I can boss the bot.

  • So what I'm going to do, I'm going

  • to stop Marie and just start the conversation as

  • if I would be someone controlling and managing

  • that online conversation.

  • So hi, this is someone alive.

  • I want to get out.

  • You know, it's going to be a little bit silly now,

  • because I'm going to have that conversation with myself.

  • But the point is API.AI is responding now.

  • And I have a full control of what is happening, OK?

  • So you see that point.

  • Now, to satisfy the development and coding people

  • amongst yourselves, let's actually

  • see how ridiculously simply you can implement this yourself.

  • So our conversation logic is built on node.js.

  • And this is how you can require the Google NPM package.

  • So hopefully, you will use this commented line and not

  • this hacked way, which we used over here.

  • And then you just call a method that is the detection method.

  • And you wait for a callback.

  • And you get a callback back after some time.

  • And then you do whatever you want with that sentiment value.

  • So it's extremely simple.

  • And I think the guys at Google have done a great job with that one.
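
The slide shown on stage used the 2017-era callback-style Google NPM package; as a hedged sketch, here is the equivalent with the current `@google-cloud/language` client, whose `analyzeSentiment` call returns a document score between -1.0 and 1.0.

```javascript
const { LanguageServiceClient } = require('@google-cloud/language');
const client = new LanguageServiceClient();    // uses application default credentials

async function detectSentiment(text) {
  const [result] = await client.analyzeSentiment({
    document: { content: text, type: 'PLAIN_TEXT' }
  });
  // score: -1.0 (very negative) to +1.0 (very positive)
  return result.documentSentiment.score;
}

// e.g. detectSentiment('this bot is terrible').then((score) => {
//   if (score < -0.6) { /* push to the intervention dashboard */ }
// });
```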

  • Right.

  • Let's go to the API.AI and how we implemented this.

  • So what I wanted to show you, and how

  • we trigger the conversational logic,

  • is like this.

  • So say, where is my card or something else.

  • The JSON line you are interested in is this one.

  • So there is an action.

  • And just using that action, you can program anything manually

  • for fulfillments further on.

  • But what is maybe more interesting here,

  • and I'm going to take you to our development Marie,

  • is that once your bots are going to be scaled further

  • and you are going to build a lot of different intents

  • and conversations, it will get messy.

  • It will be back and forth.

  • And people will not just follow the preprogrammed line.

  • So you want to be able to still navigate

  • within that conversation.

  • Now, one of the ways we found, at least for ourselves,

  • how you can do that in API.AI is to create, first,

  • some kind of a naming convention,

  • because otherwise it's a big mess.

  • And understand that for one use case,

  • there are actually many intents which you can use.

  • So say there is a use case, activate card,

  • but that one use case consists of several intents

  • which you want to trigger.

  • So we're going to start the first intent

  • and then pass this context up until the last one.

  • So let's see how it looks.

  • So you see that the first intent has no input context, but just

  • the output context.

  • And then you want to pass those along the conversation.

  • Here are the [INAUDIBLE].

  • Typically, you would put hundreds of them, which we

  • have in our to-be-live version.

  • And then what is also interesting

  • here is managing the conversation.

  • So here we put the response without a parameter and here

  • with a parameter.

  • And only if you have the parameter, as Dan showed you

  • before, it will be triggered.

  • All right.

  • So I hope you guys liked our little demo.

  • I'm very happy to connect with you during the conference.

  • And yeah, thank you for watching.

  • [APPLAUSE]

  • So we're going to have a bit of a Q&A.

  • So I'd like to call Cornelius and Dan back to the stage.

  • And, well, fire away.

  • And the nice guys over there asked us

  • if you can come up to the mic, so that everyone can hear you.

  • DAN AHARON: We can probably switch back to the slides,

  • I guess.

  • PETR NETUSIL: Yeah.

  • Questions.

  • DAN AHARON: If anyone has a question, please, yeah,

  • come up to the mic.

  • PETR NETUSIL: Yeah, or just shout very loud.

  • AUDIENCE: Hi.

  • Thanks for the great demo.

  • One question is, what do you do about analytics?

  • You say the API.AI is a complete system.

  • I didn't see any analytics link.

  • DAN AHARON: Yeah, that's a great question.

  • So we don't have any analytics built into API.AI right now.

  • There's a bunch of other vendors that we work

  • with that provide analytics.

  • If you connect with me offline, I'm

  • happy to recommend a few to you.

  • PETR NETUSIL: Well, the way we approach it, we log everything

  • and we have our data scientists go crazy on it.

  • So maybe that's a way how you can do that too.

  • AUDIENCE: And sorry, question for you.

  • So you interrupted the Facebook conversation

  • and had a manual intercept.

  • How did you do that?

  • PETR NETUSIL: Manual intercept of what?

  • AUDIENCE: The conversation.

  • You sort of had a manual intercept

  • into the bot conversation.

  • How did that work?

  • PETR NETUSIL: So it's in node.js.

  • But I'd say that there is a global parameter for all

  • the sessions you can set up, whether it's paused or not,

  • and then you manage it based on that.

  • So I can show you later on, if you want, in detail.
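
Since the detail didn't make it on stage, here is a hedged sketch of how a per-session pause flag might work in the Node.js conversation logic; the session store and the helper names are hypothetical.

```javascript
// Hypothetical per-session state kept by the conversation logic.
const sessions = new Map();   // sessionId -> { botPaused: boolean }

function setBotPaused(sessionId, paused) {
  const session = sessions.get(sessionId) || { botPaused: false };
  session.botPaused = paused;
  sessions.set(sessionId, session);   // flipped from the dashboard's "boss the bot" control
}

async function handleUserMessage(sessionId, text) {
  const session = sessions.get(sessionId) || { botPaused: false };

  if (session.botPaused) {
    // A human agent has taken over: skip API.AI and relay to the dashboard.
    return relayToDashboard(sessionId, text);
  }
  return queryApiAi(sessionId, text);
}

// Hypothetical helpers, stubbed so the sketch is self-contained.
async function relayToDashboard(sessionId, text) { /* push over the WebSocket */ }
async function queryApiAi(sessionId, text) { /* textRequest to API.AI, as sketched earlier */ }
```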

  • AUDIENCE: OK, thank you.

  • Thanks.

  • PETR NETUSIL: Anyone else?

  • CORNELIUS MUNZ: Hey, come on up.

  • PETR NETUSIL: There is actually a line.

  • That's great.

  • AUDIENCE: I have a couple of questions.

  • The first one is regarding how [INAUDIBLE]

  • figures out the intent.

  • Can I have two intents [INAUDIBLE]

  • technically take the same number of entities?

  • And how does it figure it out?

  • Let's say that I have one intent that

  • is supposed to act on a negative set of words and the other one

  • acts on positive.

  • DAN AHARON: Yeah.

  • So it learns by language.

  • So if you set up the intent with examples of negative sentences

  • and a set of entities, and then a different intent that

  • has positive sentences and even the same entities,

  • API.AI knows how to figure out which intent to choose

  • according to your examples.

  • AUDIENCE: And can you load the entities from, let's say,

  • a transactional table that you are continuously

  • updating all day long?

  • DAN AHARON: Yes.

  • Yes, you can import or export from the UI.

  • And also, there's a full back end API

  • that you can access everything programmatically,

  • which we didn't show today.

  • But it's all documented on the website.
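
For the programmatic route, here is a hedged sketch of pushing entity entries from your own data store to the developer API Dan mentions. The endpoint path, the `v` version parameter, and the payload shape are recalled from the V1 developer API and should be verified against the documentation; the token comes from the agent's settings. It uses the built-in `fetch` available in Node 18+.

```javascript
// Hedged sketch: create (POST) or later update (PUT) a 'Products' entity
// from a transactional table. Endpoint and payload shape are from memory
// of the API.AI V1 developer API -- verify against the documentation.
async function syncProductsEntity(rows) {
  const entries = rows.map((row) => ({
    value: row.name,               // canonical value, e.g. 'Pixel'
    synonyms: row.synonyms         // e.g. ['Pixel', 'Google Pixel', 'Pixels']
  }));

  const response = await fetch('https://api.api.ai/v1/entities?v=20150910', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.APIAI_DEVELOPER_TOKEN}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ name: 'Products', entries })
  });
  return response.json();
}

// e.g. syncProductsEntity([{ name: 'Pixel', synonyms: ['Pixel', 'Google Pixel', 'Pixels'] }]);
```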

  • AUDIENCE: And does that require retraining every time there's

  • a new entity on that table?

  • DAN AHARON: Yes, but the training happens very quickly.

  • In fact, the model was trained multiple times during my demo.

  • It happened in the background and you didn't even notice.

  • AUDIENCE: OK, thank you.

  • DAN AHARON: Sure.

  • AUDIENCE: Awesome demo.

  • Thanks, guys.

  • PETR NETUSIL: Some [INAUDIBLE] here.

  • AUDIENCE: I had a question on the difference between the toy

  • applications and when you went to production.

  • It seems like you'd have a combinatorial [INAUDIBLE] explosion

  • in natural language.

  • And how did you handle that?

  • You said you had hundreds of the templates up there.

  • PETR NETUSIL: So we are not live yet.

  • We will be live in a few months.

  • But the way we have approached it is we

  • have lots of user sessions.

  • And we basically had people come in

  • and say, just tell me, how would you ask this question?

  • And we have, yeah, I think we have hundreds

  • and hundreds of-- you know, just per one small intent,

  • hundreds and hundreds of utterances

  • programmed over there.

  • Try to get a trend.

  • I think it's easy and fun to do a bot,

  • but it's difficult to do it right.

  • So this is one of the parts where, yeah, you

  • need to do it right.

  • DAN AHARON: If you're interested,

  • you should check out the third session, which

  • is about actions on Google and talking about how to build

  • conversational actions.

  • They're going to talk a lot about lessons

  • of how to build the right bot and the right conversational interfaces

  • that think about all those different expressions.

  • One thing we see a lot of people do

  • is use cards and buttons, which sort of help

  • simplify the experience for some users.

  • But make that optional.

  • So give the users the option of whether they

  • want to use the cards or whether they want

  • to type in Natural Language.

  • AUDIENCE: Would the semantic analysis help in that regard?

  • DAN AHARON: Yeah, yeah, some users love to type.

  • And they want to express themselves that way,

  • while others just want the fastest way of getting

  • things done, right?

  • So you give them both.

  • And then they pick whatever they want.

  • AUDIENCE: Thank you.

  • DAN AHARON: Sure.

  • AUDIENCE: Hi.

  • I'm not sure I know how to ask this question right.

  • But every demo that I've ever seen on conversation

  • is always one-on-one.

  • What about the bot being kind of a moderator of a conversation

  • between multiple people?

  • DAN AHARON: Good question.

  • So the way that it's structured is

  • it's built on a conversation.

  • And it has, as context, all of the prior conversation

  • that happened.

  • So if you have a conversation with multiple people and one

  • bot, then as long as your intention is

  • for it to remember what everyone else is saying,

  • then you're good to go.

  • Then it doesn't change anything in the logic.

  • AUDIENCE: Yeah, like if you were trying to figure out, hey,

  • we're going out for dinner and here are five different places

  • we want to go--

  • DAN AHARON: Yeah.

  • AUDIENCE: --that sort of thing.

  • DAN AHARON: Right, like Google Assistant--

  • AUDIENCE: Yeah.

  • DAN AHARON: --and Google [INAUDIBLE].

  • Yep.

  • AUDIENCE: Yeah, yeah.

  • All right, thank you.

  • DAN AHARON: Sure.

  • AUDIENCE: Hello.

  • Great talk.

  • Great demo.

  • DAN AHARON: Thanks.

  • AUDIENCE: I liked that question too.

  • I guess, two things.

  • One, are there push notifications with this yet?

  • Could you set something that eventually the chat will just

  • chime in when a counter goes off or you set a reminder?

  • DAN AHARON: So I don't think so.

  • But let me get back to you.

  • Are we back to the slides?

  • OK.

  • Send me an email.

  • And I'll get back to you.

  • PETR NETUSIL: The way you can solve this is actually

  • to trigger it--

  • at least in our case, to trigger it

  • from your conversational logic.

  • So that's why we put our thing in the middle to say,

  • hey, psshht, let's fire up a push notification.

  • AUDIENCE: So you would say like, show

  • me the status or something, and then you would get back?

  • Is that what you're saying?

  • PETR NETUSIL: It's from the application side, not from API.AI

  • directly.

  • AUDIENCE: Oh, I got you.

  • Yeah.

  • And a quick question about deployment.

  • So you had a dev environment and production,

  • would you just edit the interface

  • or do you version control that?

  • PETR NETUSIL: Good question.

  • DAN AHARON: Cornelius?

  • PETR NETUSIL: Oh, Cornelius [INAUDIBLE].

  • CORNELIUS MUNZ: I didn't get it right.

  • So can you repeat it?

  • AUDIENCE: Oh, so you had a dev environment and prod,

  • do you version control the configurations?

  • How would you deploy your dev to prod?

  • CORNELIUS MUNZ: Yeah, we have version control; we use Git.

  • We have different environments, staging, testing, and stuff.

  • And yeah, it's a continuous integration environment.

  • AUDIENCE: OK, cool.

  • PETR NETUSIL: So I think there's version control on many levels.

  • On the code, we use Git, like any other person in the world.

  • But for API.AI directly, we just created different bots.

  • And what you can do is just extract one bot.

  • Just export everything there is and just

  • import it in another bot.

  • CORNELIUS MUNZ: So it can export as [INAUDIBLE] file.

  • PETR NETUSIL: It's export/import.

  • CORNELIUS MUNZ: So all--

  • PETR NETUSIL: [INAUDIBLE]

  • CORNELIUS MUNZ: --yeah, configuration

  • of the bot itself, not your code, so I didn't get it.

  • So the configuration of the bot can be exported as a zip file.

  • And you can store it safe within--

  • and you can import it later on with intents, entities,

  • contexts all available.

  • AUDIENCE: Oh, perfect.

  • CORNELIUS MUNZ: And at the end, it's a JSON file.

  • AUDIENCE: Oh, cool.

  • All right, thank you.

  • AUDIENCE: [INAUDIBLE]

  • PETR NETUSIL: Manual.

  • Yeah.

  • CORNELIUS MUNZ: Manual.

  • AUDIENCE: So one of your early slides

  • showed Cortana and Alexa.

  • DAN AHARON: Right.

  • AUDIENCE: Is it trivial for me to create

  • a bot that then interacts with Google Home, Alexa, Cortana,

  • a trivial connection?

  • DAN AHARON: Yeah, so their interfaces

  • are slightly different, right?

  • And so the data that they send to other applications and back,

  • the format is a little different.

  • But we've built our connectors to try and minimize

  • that sort of interruption as much as possible,

  • so that our developers can focus on the actual conversational

  • logic.

  • So for the most part, you can build one logic

  • and connect it to multiple connectors.

  • AUDIENCE: It'll connect

  • DAN AHARON: Yeah.

  • Now, the caveat is when you're talking to Google Home

  • by voice versus if you're texting

  • on Google Allo or Facebook Messenger,

  • the users speak sometimes in a different way,

  • use different words.

  • And the way you want to talk back to them is also different.

  • Like on a chat, you might want to show them cards with a map

  • or maybe you want to write a long paragraph of text.

  • When you're reading back on Google Home,

  • you don't want to read back a long paragraph of text,

  • because users will get bored, right?

  • So you actually may want to build

  • different conversational experiences for each one.

  • But if you wanted to create a single experience, you could.

  • CORNELIUS MUNZ: And from a technical point of view,

  • you have to make a trust relation between Facebook

  • and API.AI.

  • And you would generate an API key on Facebook or Skype

  • or whatever and copy that API key into the API.AI console.

  • And that's all so a trust relation is there.

  • And then Skype can talk to API.AI

  • and respond to that message.

  • AUDIENCE: Thank you.

  • Thank you, both.

  • DAN AHARON: OK, two minutes left.

  • So maybe one or two questions.

  • AUDIENCE: Good afternoon.

  • I was trying to build a toy project the other day.

  • There I used your built-in address, street address entity.

  • I noticed that in our test data, there

  • are certain addresses with a rather unusual construct

  • that never get recognized.

  • So I added those to the training data.

  • But it just doesn't seem to correct that.

  • So what do you recommend developers do in a situation like this,

  • when the black box fails?

  • DAN AHARON: Yeah, so it might be a bug.

  • So I would go--

  • we have a community forum.

  • I would report it in the community forum.

  • And we can look into it and see if it's a bug.

  • AUDIENCE: Thank you.

  • DAN AHARON: Sure.

  • AUDIENCE: Yeah, so my first question

  • is, is there a way to resume a conversation

  • or is it all within one conversation?

  • DAN AHARON: So that was on yours, right?

  • PETR NETUSIL: Yeah, it could be.

  • So if you mean the bot boss, you can just

  • flip it back and forth.

  • So as an agent, you have full control.

  • You can just start it, stop it whenever you like.

  • And secondly, every session we have with a bot is in memory.

  • So we remember everything you did, you said,

  • and all the context you have had.

  • So you resume anytime you'd like.

  • AUDIENCE: OK, cool.

  • And then more about the development life cycle.

  • You said we can export or import the configuration.

  • What about, is the training data part of the configuration?

  • PETR NETUSIL: Yeah, all the utterances, like the hundreds

  • I was talking about, are part of that.

  • AUDIENCE: And OK, just to confirm,

  • the last question is, with what level of certainty,

  • given certain training data, will the response

  • always be the same?

  • Or is it behind the scenes and we don't know?

  • DAN AHARON: So it's machine learning-based, right?

  • So the more training data you give it,

  • every time you give it more data,

  • it could change the potential answer.

  • But we do give a confidence score with the results,

  • so you can see sort of [INAUDIBLE].

  • CORNELIUS MUNZ: It's part of [INAUDIBLE].

  • And you can see the confidence level, it's 80%, 90%, or lower.

  • And you can decide by your own, if it is lower, to give a--

  • DAN AHARON: We're at time.

  • But thank you, everyone, so much for coming.

  • PETR NETUSIL: Thank you.

  • DAN AHARON: This has been great.

  • And have a great rest of Google Cloud Next.

  • Thanks, Petr, Cornelius.

[MUSIC PLAYING]
