Machine Learning for Front-End Developers by Charlie Gerard | JSConf.Asia 2019

  • Good morning.

  • So first of all, before I start, I really want to thank everybody involved in making JSConf.Asia happen, for having me and for organizing a really cool event so far.

  • And thank all of you for being here this morning.

  • It might be a bit early for some of you.

  • So thanks for coming here to listen to me talk about machine learning for front-end developers.

  • So before I start: I've given this talk a couple of times before, and the feedback or comment I usually get is that I speak very fast.

  • So I'm gonna try to be mindful and slow down a little bit on some slides.

  • But I get really excited about sharing what I know with you, so my natural pace always comes back.

  • And also, I'd like to remind you that we're in a fast-paced industry, so it's not that I'm fast, you're slow.

  • Anyway, I'm going to give it a try.

  • So, my name is Charlie Gerard.

  • I am a developer at a product company called Atlassian in Sydney, and outside of that, in the community, I'm also part of two tech community programs called Google Developer Experts and Mozilla Tech Speakers.

  • If you're interested in learning more about what those are, I know that there are a few other members here as well.

  • So there's quite a big community of people sharing knowledge through conferences and blog posts and workshops and things like that.

  • But I'm not here to talk about that.

  • I'm here to talk about machine learning. Because I don't know what you know and what you don't know, I usually start with an example to try to put everybody on the same level about what machine learning is.

  • But yesterday Monica did a very nice introduction, talking about how to recognize pictures of cats and dogs, and I thought that was really simple and well done.

  • So I removed some of my slides.

  • And I'm just gonna briefly say, if you were not here or if you don't remember: it's basically giving computers the ability to find patterns in data without being explicitly programmed.

  • So if I just take back the example from Monica yesterday: the first time that a computer sees a picture of a cat, it doesn't really know what it is.

  • It's never seen one before.

  • But by giving it millions of samples of data of what a cat looks like and what a dog looks like, using algorithms and machine learning models, the computer is able to find patterns in what makes a cat a cat in a picture and what makes a dog a dog.

  • So at a high level, this is basically that.

  • So at the core of it are machine learning models, and to build a model, you need algorithms.

  • So let's dive a little bit into this.

  • So if you start looking into machine learning, you might come across some of these algorithms.

  • So I know that when I started, I definitely heard a lot about Naive Bayes and K-nearest neighbors.

  • And there's really a lot of them, and they're good at solving different types of problems and dealing with different types of data.

  • So before even looking into what algorithm you want to use, you have to understand what type of problem you want to solve.

  • So there are three main types of problems in machine learning.

  • You have supervised learning, unsupervised learning and reinforcement learning.

  • For this talk, knowing that I only have half an hour, I'm only gonna cover supervised and unsupervised.

  • Also, because a lot of the time when you get started, the problems that you want to try and solve can be solved with supervised and unsupervised.

  • So at first, maybe don't bother with understanding what reinforcement learning is.

  • So let's start with supervised learning.

  • If we start with a really short definition, it's basically creating a predictive model based on a set of features and labels.

  • And there's three main terms in this definition that you need to understand.

  • It's, you know: what is a predictive model, what are features, and what is a label?

  • So I don't know about you, but I learn more when I can kind of, like relate to examples.

  • So a classic example of a supervised learning problem is predicting the price of a house.

  • So let's imagine, you know, we're in Singapore.

  • We have a house, if we're lucky enough, that we want to be able to sell, and we don't know how much we should sell it for, you know, for us to make a profit and sell it fast and things like that.

  • But what we have is a lot of data about all the houses in Singapore.

  • So we start with that.

  • So how can we predict the price of our house based on the price of houses on the market?

  • So the label, in that case, is the price.

  • It's what you're actually trying to predict.

  • So in the data set that you're going to put together, it's how you would classify each entry.

  • So as our problem is trying to predict the price, if you imagine an Excel spreadsheet, we would have a column with all the prices of all the houses in Singapore, and then we would have our features.

  • Features are the characteristics of the entries in your data set.

  • So for houses it would be the number of floors, the number of rooms, the number of bathrooms, whether it has a garden or not, the neighborhood and things like that.

  • And using these labels and features, using an algorithm, you would try to understand the correlation between the features and the price, the characteristics and the price of the house, to generate a model, which ends up being a mathematical representation of the outcome of your training.

  • So using everything that you know about the houses, you use algorithms, and the model ends up being like a function that takes the parameters of a new house it's never seen before, and it's able to generate a prediction of the price that the house should go for, based on all the houses that it's seen on the market.

  • So it's important to know that supervised learning needs labels and features to generate a predictive model.
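
To make the features-and-labels idea concrete, here is a minimal JavaScript sketch; the field names and prices are made up for illustration and are not from the talk.

```js
// Hypothetical supervised-learning data set: each entry has
// features (the characteristics of a house) and a label (its price).
const houses = [
  { features: { rooms: 3, bathrooms: 2, floors: 1, hasGarden: true },  label: 620000 },
  { features: { rooms: 2, bathrooms: 1, floors: 1, hasGarden: false }, label: 450000 },
  { features: { rooms: 4, bathrooms: 3, floors: 2, hasGarden: true },  label: 890000 },
];

// Training finds the correlation between features and label;
// the resulting model is then asked about a house it has never seen.
const trainingFeatures = houses.map((house) => house.features);
const trainingLabels = houses.map((house) => house.label);
```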

  • And then, if we look at another type of learning, unsupervised learning, it's the ability to create predictions based only on a set of features.

  • And the end of the definition is really important because it's the main difference between unsupervised and supervised.

  • So if we think about our problem of predicting the price of a house, it wouldn't really work, as it's not an unsupervised problem.

  • How can you predict the price of a house if you have never even seen what the price is?

  • You only know the characteristics of a house.

  • So in that particular case, a type of problem that unsupervised learning solves is more around clustering, so an example of that would be predicting customer behavior.

  • So when you go to the supermarket, they might not know who you are at all.

  • But based on what you buy, how often, how expensive it is, and where, they can actually cluster people together into different categories, and based on your buying behavior, they would then be able to know what advertising to send you, because you're more likely to buy the same things that people like you have bought as well.

  • So unsupervised learning is more around clustering rather than getting a particular answer to a question.

  • So now that we've talked about the types of learning, I talked about problems like predicting the price of a house or customer behavior.

  • But it might not be something that you want to do.

  • And yesterday in Monica's talk, we saw that machine learning can be used in a lot of more creative ways.

  • So let's talk briefly about different applications.

  • I just put a short list together, and the first one I'm going to talk about is this example here, where you upload a picture of a cat and then an algorithm generates for you a piece of text that you can use as alt text for your image tags in your HTML.

  • So this experiment was built by Sarah Drasner.

  • I put the link there if you want to try a bit later.

  • So maybe for one image you might say, oh, I can do it myself, but you can see that using machine learning can maybe eventually automate some of the things that we do as developers.

  • So we can actually spend time, um, solving some of the logic that we want to solve.

  • So that's just like a quick example.

  • Another one that would have been really useful for me a few years ago is this framework called NSFW JS, where again you upload a picture in this little experiment here, and it gives you the probability of it being an image that is not safe for work.

  • And I used to work in advertising, and when you work for brands, sometimes they want to put competitions together where they want to engage with customers.

  • So people have to upload a selfie or whatever to win a contest.

  • So people upload a lot of images, and most of the time the brand wants to see the images live in a gallery or whatever.

  • And the thing is that the check that you have to do to make sure that these images are okay is usually done manually, either by a product manager or by the developers. So using machine learning in the front end, you could actually do a pre-check of what people are uploading before saving it into the database and before displaying it live on the page.

  • So that's definitely something I could have used a few years ago.

  • So I thought it was really interesting.

  • And then finally, another example that is really, really awesome, I really love it, because it mixes a lot of different technologies and it can actually help people in terms of accessibility.

  • So I don't know if you've seen this one, but this developer called Abhishek Singh created a prototype around the concept of kind of like future interactions.

  • So he was thinking about how, with voice UIs, when you talk to Amazon Alexa, what about people who can't talk?

  • Are we just gonna totally cut them out of, you know, innovation and things like that?

  • So what he decided to do is to use machine learning in JavaScript to capture data from the webcam feed, train an algorithm to recognize gestures and map them to words, so he could actually communicate with Alexa just via gestures, by training an algorithm.

  • So I thought it was a really creative and really interesting way of using machine learning and JavaScript.

  • So instead of just talking about what you can do, let's dive in a little bit.

  • First of all, why JavaScript?

  • Because when you think about machine learning, you probably think about Python before you think about JavaScript.

  • But I think it's really interesting to now be able to do that in JavaScript, because in terms of learning curve, it's a lot easier for developers to know that they only have to understand a few concepts of machine learning and understand the syntax of a framework, rather than having to learn an entirely new language.

  • We learn new syntax like every day, when we move from jQuery to Angular to Vue to React, so we're used to doing that.

  • And now it's just transferring that skill of being, you know, adaptable to another domain.

  • So I really love the fact that you don't have to be a data scientist.

  • You don't have to only know Python.

  • It's actually out there and you can start doing it now.

  • You know, the point that I think is important is around rapid prototyping.

  • So I really love doing that.

  • What I'm talking about has nothing to do with what I do for work.

  • So all of my experiments are things that I do outside, you know, after work or on the weekend.

  • So sometimes when I want to validate an idea, I need it to be fast, because I'm not that patient.

  • I don't want to spend six months on a project; usually I want it to be done in, like, two hours.

  • So being able to prototype something fast in JavaScript is really cool, because if you have an idea in mind, and you want to be given the time or the money at work to explore it more, you have to convince people that it's worth it.

  • And to be able to convince them, usually you have to show them something, some kind of MVP or a proof of concept, and being able to do machine learning in JavaScript means that you can validate an idea and then eventually move to Python if you want to push it further.

  • But I think that's one of the really awesome things that I like about it.

  • And then finally, there is a bigger and bigger ecosystem of tools that you can use in JavaScript, and there's more and more documentation.

  • There's a lot of tutorials and courses, so I think it's a good time to get started if you want to get into that space.

  • So what can you do? You can do three things.

  • First of all, you can import an existing pre-trained model, so it's a model that has already been trained with a lot of data for a specific purpose, and you can just import it in your app and use it as is.

  • You can also retrain an imported model; that's what we call transfer learning.

  • So you kind of do the same thing as in the first step: you can use a model that is open source, you can add your own samples on top of it and kind of retrain it really fast.

  • And then, finally, you can do all of it in JavaScript: you can define, train and run your model entirely in the browser.

  • So in terms of tools, I think the most popular is TensorFlow.js.

  • But sometimes I feel like some of the terms that TensorFlow.js uses might be confusing for people who are just getting started.

  • So I really like the second one on the right, ml5.js. It's a framework that tries to abstract some of those terms; it just tries to make it really, really easy for you to get started.

  • So it's a lot simpler, and it has a few examples, but I feel like it's still kind of under construction.

  • There are some examples and some great things that work, but sometimes when you look at the docs, you can see that it's still in progress.

  • But definitely have a look.
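
To give a rough idea of how much ml5.js abstracts away, this is approximately what its documented MobileNet image-classification example looks like; the exact API may differ between versions, so treat it as a sketch.

```js
// Rough sketch of image classification with ml5.js (API may vary by version).
const img = document.getElementById('image');

ml5.imageClassifier('MobileNet')
  .then((classifier) => classifier.classify(img))
  .then((results) => {
    // results is an array like [{ label: 'tabby cat', confidence: 0.92 }, ...]
    console.log(results[0].label, results[0].confidence);
  });
```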

  • And then if you don't really want to write any of it yourself but just want to ping some APIs, you can leverage the tools from Amazon, Google and Microsoft, and then the M in the middle is for Magenta.js.

  • Now, if you were here yesterday, you know what that is.

  • So some frameworks are actually more specialized for certain types of data and to do certain things already.

  • Okay, so we are at the part where I'm going to do some demos and code, because I feel like to get people on board, sometimes you can't just say, oh, it's easy.

  • You kind of have to actually show that it's easy, so people are a bit more comfortable.

  • So to get started, I'm going to talk about the quick demo that I built around using a pre-trained model.

  • So, I want to get better at recycling, but I always forget what should go in what bin.

  • So that's not really helpful when you want to recycle.

  • So I wanted to see if I could use machine learning to help me do that, you know, kind of do it automatically.

  • So I'm gonna demo what I did and then show the code.

  • So this demo, I kind of built it to be used on the phone, but I can run it on the desktop as well because it's a website.

  • But let me just... all right, so I'm going to start.

  • It's going to need the camera feed, and I have a sticker hiding my camera, so I need to remove it.

  • I need to buy myself a proper camera cover so I can put it back.

  • Okay, all right, so I have started the camera feed. You have to imagine that you have it on the phone and it's not pointing at you, but I just have a button here that says, you know, "is this recyclable?".

  • So it works well with bottles of water.

  • So I have a bottle of water; I'm going to put my microphone down because I need both hands. Okay. Thank you.

  • So it detects it is a bottle, so then on my phone, I would click yes, it is a bottle.

  • And then it says, oh, it is recyclable, throw it in the yellow bin.

  • So in Australia it's yellow; I didn't actually check here if it was the same, but anyway.

  • So you can start again with different objects.

  • And I'm going to exit this because I don't need to see myself.

  • So it does the whole image recognition using machine learning, and then you could imagine that you could build an API that you can just query to say, hey, this object, which bin should it go in, and things like that.

  • So, in terms of code, I used the COCO-SSD open source model.

  • That's an object detection model.

  • And then I import TensorFlow.js as well.

  • Then you load the model to be able to use it.

  • Then you get the camera element from the HTML, and you use some of the TensorFlow.js utility methods to look at the pixels of that camera feed.

  • You transform the data so that it can be in the format that TensorFlow and the model are expecting.

  • And then finally, you kind of give that processed image to the model, and you ask it for predictions.

  • It's kind of like: here's the data coming from the webcam feed.

  • Have you seen this before?

  • Do you know what this is?

  • And it recognizes, you know, my bottle.

  • So, of course, there's a bit more code around the UI and buttons and things, but just to be able to recognize images using that model, it's basically: you load the model, you get the data from the camera feed, you transform it a little bit and you ask the model, you know, can you detect this?
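
Those steps look roughly like this with the published @tensorflow-models/coco-ssd package; this is a simplified sketch, not the exact code from the demo, and the element id is an assumption.

```js
import '@tensorflow/tfjs';
import * as cocoSsd from '@tensorflow-models/coco-ssd';

// The webcam <video> element from the HTML.
const video = document.getElementById('webcam');

async function detect() {
  // Load the pre-trained COCO-SSD object detection model.
  const model = await cocoSsd.load();

  // Ask the model for predictions on the current video frame.
  const predictions = await model.detect(video);

  // Each prediction has a class name, a confidence score and a bounding box.
  predictions.forEach((p) => {
    console.log(`${p.class} (${(p.score * 100).toFixed(1)}%)`, p.bbox);
  });
}

detect();
```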

  • So that was the first thing, importing a pre-trained model. Now let's talk about transfer learning.

  • So transfer learning, as I said before, is using a pre-trained model but adding your own samples.

  • So for this one, I like to try and play around with new technologies and mix them with accessibility.

  • So I'm just gonna launch the... oh, my God, I'm still here.

  • Okay, um, so this one is going to use the camera as well.

  • So see, the camera feed is gonna be really small, but there's a reason for that: you're not supposed to see it.

  • I just show it so you can see what's going on. But okay, so I'm gonna go to the right.

  • I'm adding some samples.

  • I'm gonna go to the left, I'm gonna go down, and I'm gonna add one more.

  • Then I'm testing. Okay, left. Right. That's pretty good.

  • And then I'm able to actually, in less than what, 30 seconds, train a model to recognize my hand movements.

  • I had real gestures, and I can go right and do that live, like, wow.

  • Okay, so all right, I really like this; it's like my favorite feature of using JavaScript for machine learning. And in terms of code...

  • So there's a little bit more code that I can't go too much into, because I'm looking at the time, and I think I'm gonna be over time.

  • But I am loading another kind of image recognition model called MobileNet.

  • I'm again loading TensorFlow.js.

  • But I'm also loading a classifier, because I need to kind of classify my personal samples with the model.

  • Then you just declare some variables.

  • I have four classes I'm trying to classify because I had four gestures, but it could be, you know, the number that you want.

  • The image size of 227 is hard-coded; I started from an example.

  • I think it's because the bigger the image, the longer it's gonna take to train.

  • So I think if you want it to be fast, then 227 is fine.

  • The topK is the K value for K-nearest neighbors.

  • That is kind of arbitrary as well.

  • You can change it.

  • I can't go too much into what the algorithm actually does, but this value is needed; it doesn't have to be 10.

  • But then, the more interesting part is: you create the classifier, you load your model, then you do the same thing as before.

  • You look at the data coming from the camera feed.

  • But instead of giving it straight to the model, you kind of have to combine it with the model that you loaded.

  • So I'm not exactly sure what the infer method does, but from my understanding, it kind of merges your image data from the browser into the MobileNet model.

  • And then it adds all of this to the K-nearest-neighbors classifier that you created before, and then you're able to predict the class of your new samples based on this merge of the model and your personal samples.

  • So then you kind of get a prediction of your current position based on my personal samples, and yeah, that's it.

  • So that was a very quick explanation, but in terms of steps: you create the classifier, you load the model, you look again at the data coming from the webcam, you kind of do the merge, and you ask your model to predict.

  • And finally you can dispose of the images that you got from your webcam because you kind of already retrained the classifier so you don't need to keep it.
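
Those steps, sketched with the published MobileNet and KNN-classifier packages for TensorFlow.js; the class indices and the element id are assumptions, and this is a simplified version rather than the talk's exact code.

```js
import * as tf from '@tensorflow/tfjs';
import * as mobilenet from '@tensorflow-models/mobilenet';
import * as knnClassifier from '@tensorflow-models/knn-classifier';

const TOPK = 10; // K value for K-nearest neighbors
const video = document.getElementById('webcam'); // assumed <video> element id

const classifier = knnClassifier.create();
let model;

async function setup() {
  model = await mobilenet.load();
}

// While holding a gesture, add the current webcam frame as an example
// for one of the classes (e.g. 0 = right, 1 = left, 2 = down, 3 = other).
function addExample(classIndex) {
  const image = tf.browser.fromPixels(video);
  // infer(..., true) returns MobileNet's intermediate activation ("embedding").
  const activation = model.infer(image, true);
  classifier.addExample(activation, classIndex);
  image.dispose(); // the classifier keeps the activation, not the raw frame
}

// Predict which gesture the current frame looks most like.
async function predict() {
  const image = tf.browser.fromPixels(video);
  const activation = model.infer(image, true);
  const result = await classifier.predictClass(activation, TOPK);
  console.log('predicted class:', result.label, result.confidences);
  image.dispose();
}
```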

  • So that was for transfer learning. And then finally, doing everything in JavaScript.

  • So for this one, this demo, I'm always a little bit nervous because I don't know how people are going to take it.

  • But yesterday somebody mentioned Pornhub on stage.

  • I think I'm fine.

  • What I built is a willy detector.

  • So we're going to stay on the same level.

  • If you don't know what a willy is, a willy is a penis, but it's not pictures; I'm not crazy.

  • And also, I'm really not interested in pictures of this, so I'm just gonna launch it quickly.

  • Let me just stop this; it was still going. Okay, that should be all right.

  • So basically, I built something from scratch that recognizes what you draw.

  • I'm going to start with something, you know, safe.

  • So it's a candle, just in case.

  • That's a candle, and if I click predict, it tells me it's not a willy. Like, yes, it's not a willy.

  • And then I can clear it, and, you know, I don't need two hands for this. So, yeah... and a willy. Okay.

  • And what I forgot to say, I forgot to talk about why I built this, because I don't just spend my spare time drawing willies.

  • So I built this because I wanted to build something else, where I wanted to let people collaborate in the browser, you know, using WebSockets, from anywhere in the world.

  • But when you let people draw stuff on the Internet, you can be sure that they're gonna be drawing willies.

  • And I didn't want to spend my time checking, you know, what people are drawing and then deleting it and stuff like that.

  • So that's my way of using machine learning to do that for me.

  • So at the moment, you know, I have to click to be able to predict if it's a willy or not, but you could actually change the code to detect as somebody is drawing.

  • So it means that you wouldn't even have time to finish; the willy would be gone.

  • I still haven't built that project, but I'm still happy with it. So now let's talk about how I put it together.

  • Oh, okay. Yeah, I know, and that's the only thing people will remember from the talk.

  • So, okay, when you build, when you do everything from scratch, you usually have to gather data, right?

  • So on the right, you have the Quick, Draw! data set, where there was a game online where people were asked to draw, I don't know, a strawberry and a door and glasses.

  • And actually, all these samples were collected, so they were actually saved in a database.

  • And Google made this data set open source, so we can use it.

  • The thing is, obviously Google didn't ask people to draw willies, but I needed them for my prototype, because if it doesn't know what a willy is, it can't tell me if it's one of those.

  • So I drew some of them, definitely not as many as the data set has, because that's like millions.

  • But so I needed that; I needed to put the data together.

  • The thing is that when I drew my own, and I hate this slide, but when I drew on the canvas in the browser, it was 200 by 200 pixels, while the images in the data set are 28 by 28.

  • So for me to be able to work with them, you usually have to do some kind of data processing where you have to resize, so that the whole data set is the same size in pixels.

  • And then finally, the next thing you do is you split between the training set and the test set, where it's usually 80% and 20%.

  • And the reason why you do that is because you use 80% of your data set to train your model, so it kind of understands: is it a willy or is it a candle?

  • And then your test data, that is already labeled, you already know what each image is; you use that against the model that you created to see if the accuracy is right, and if it's not right, you know that you have to retrain.
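
A minimal sketch of that 80/20 split in plain JavaScript; the helper name and the naive shuffle are just for illustration, not from the talk.

```js
// Split labeled samples into a training set (80%) and a test set (20%).
function splitData(samples) {
  const shuffled = [...samples].sort(() => Math.random() - 0.5); // naive shuffle
  const cutoff = Math.floor(shuffled.length * 0.8);
  return {
    trainingSet: shuffled.slice(0, cutoff), // used to train the model
    testSet: shuffled.slice(cutoff),        // already labeled, used to check accuracy
  };
}
```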

  • But then, after you split your data, then you can choose your algorithm.

  • I don't really think there's a reason to choose it before, but in this particular case, I used what we call a CNN, a convolutional neural network, because from what I read, it deals well with images.

  • So you know, I have images.

  • Then you can do some parameter tuning, you train, you get your prediction, and depending on the level of accuracy, if you're happy or not with it, you kind of repeat these steps over and over again.

  • You will rarely get a very good accuracy the first time.

  • Parameter tuning is kind of weird as well: you don't really understand sometimes why changing a parameter makes the accuracy better, but you kind of have to play around with it.

  • So in terms of code, I'm gonna show a really small code sample.

  • I tried to get the core bits of it, but what you have to do in TensorFlow is create tensors, which is basically a way to shape the data so that TensorFlow will understand how to deal with it.

  • And as I said, you have to create a training set and a test set.

  • Then you create your model.

  • So I have a sequential model and I add layers, a convolutional 2D layer, and then you add different layers as well.

  • I don't have time to really go into this in depth, but the number of layers is also arbitrary.

  • Sometimes you can remove some, add some, see how it performs.

  • The parameter tuning that I was talking about is what you see here, like kernel size and filters.

  • That's a little bit more of like an art than a science.

  • You kind of see what works.

  • Then you get your training data and you do what we call fitting.

  • So you give that training data and labels to the model, you pass it a few parameters, and then what you do is you get your test data and you test it against the model.

  • So using the testing data, you give it to the model and you get the prediction, and if you get the right label that you're expecting, then it means that it's fine.

  • If you want to look into this, the documentation for TensorFlow.js is actually really good.
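
For reference, a small convolutional model in TensorFlow.js looks roughly like this; the layer sizes, optimizer and class count are illustrative assumptions, not the exact values from the talk.

```js
import * as tf from '@tensorflow/tfjs';

// Inputs are 28x28 grayscale drawings; two output classes (e.g. willy / not willy).
const model = tf.sequential();
model.add(tf.layers.conv2d({
  inputShape: [28, 28, 1],
  kernelSize: 3,   // kernel size and number of filters are the tunable parameters
  filters: 16,
  activation: 'relu',
}));
model.add(tf.layers.maxPooling2d({ poolSize: 2 }));
model.add(tf.layers.flatten());
model.add(tf.layers.dense({ units: 2, activation: 'softmax' }));

model.compile({
  optimizer: 'adam',
  loss: 'categoricalCrossentropy',
  metrics: ['accuracy'],
});

async function trainAndEvaluate(trainXs, trainYs, testXs, testYs) {
  // "Fitting": give the training tensors and labels to the model.
  await model.fit(trainXs, trainYs, { epochs: 10, batchSize: 32 });

  // Check the accuracy against the held-out test set.
  const [loss, accuracy] = model.evaluate(testXs, testYs);
  accuracy.print();
}
```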

  • A few limits, very quickly.

  • So you need a large amount of data if you're not using a pre trained model.

  • You know, as we talked about before, for a computer to understand what something is, it usually needs millions of samples.

  • It can take a lot of time to train your own model, depending on what you're trying to do.

  • Especially if you're, I think, dealing with images.

  • It can take a lot of time.

  • Remember to think about the mobile experience, because when I was playing with importing a pre-trained model, some models can be a few megabytes, and that kind of hurts.

  • So if you're on a slow network, it's gonna be nearly impossible, and even in terms of using the battery of the mobile device, it can be quite expensive.

  • So just, you know, think about that if you try to develop something.

  • Liability: sometimes when you use machine learning models, you actually don't understand what's going on in the background.

  • So if you decide to use machine learning to build an application that decides who gets a loan or who gets into what expensive school, whatever...

  • You have to be sure that, if you tell somebody that they can't get a loan or that they can't get accepted and they sue you, you can't just say, oh, we used machine learning, so I don't know.

  • You have to be able to explain what happened, and most of the time people don't know; I definitely don't know.

  • And then finally, and really importantly, even if you're not interested in coding machine learning, the topic of bias and ethics in AI is, like, super interesting.

  • And I'd just like to mention that often the algorithms are not biased.

  • We are.

  • I feel like we love to push the responsibility on a piece of code and not just like, look at what we do wrong.

  • So remember, an algorithm is just code.

  • And the reason why we say it's biased is because of the data that we give it, and the data that we give it is something that we produced.

  • So just a quick example here. This was already a few years ago, so I hope they've fixed it now.

  • But if you used Google Translate before with some languages that are gender neutral, and you translate something to English, then all of a sudden it becomes gendered, where you have things like "he is hardworking" and "she is lazy"; all of a sudden it's gendered.

  • I'm not lazy. I mean, you know: "she's a cook", "he's an engineer". That's the worst.

  • So it's that kind of stuff. And then you had articles around, oh, you know, Google Translate is sexist.

  • Well, what do you think? Where does the data come from?

  • So just remember: I do think that machine learning can help us recognize some of our biases and then help us be better, but the algorithm itself is not the problem.

  • Uh, okay, I'm getting to the end of my talk.

  • So, the story about that one... I'm just going to get the unicorn.

  • So I gave this talk a few weeks ago in Lithuania, and just the day before, or a few days before, I had finished the prototype that I was building, that I was super excited about, that involved machine learning and JavaScript, and I wanted to show the world.

  • Well, now, for some people it's not a surprise anymore, but I thought that if I didn't show it here, it would be unfair.

  • So I'm gonna show it as well.

  • And it's supposed to be a topic for a talk later this year, but it is related because it uses machine learning, so I hope it's gonna work.

  • So the inspiration for this is a prototype built by a developer called Cordish, who used the webcam feed to train an algorithm to recognize some gestures and then apply them to a game.

  • So you're able to have this personalized experience of playing a game. The thing I was thinking, though, is that, well, you have to be in front of the webcam.

  • So what if the lights are out and it's dark? You can't play anymore.

  • So that was a bit of a shame, and I was at home thinking, I can do something about it.

  • So I'm gonna try and remember what I have to do. I think I have it ready.

  • Let me think... I'm going to need some audio, if you don't mind. Right, hopefully it's gonna work.

  • So they said that we could use the unicorn. So I'm gonna use it: it's me and you now, unicorn.

  • There can only be one rainbow on this stage.

  • Let me try something.

  • I'm the one in red.

  • It's looking good.

  • It didn't work.

  • Okay, wait.

  • I died.

  • You know, I just missed it. All right.

  • I don't have the music.

  • Okay?

  • No.

  • Okay, I give it one last try.

  • This is what happens when you only finish it hours before the talk.

  • Who's using a pencil?

  • You ruined my demo.

  • All right.

  • One last time.

  • No, I will not get off this stage until it works.

  • I'm sorry about the speaker after me.

  • All right.

  • I swear.

  • Okay.

  • Is somebody using it? Earlier it was working. Don't... don't.

  • All right, let me just wait for it to end.

  • No.

  • All right.

  • Look at this.

  • All right, don't die.

  • Don't die.

  • Don't die.

  • Don't die.

  • Yes, this was like one of my worst live demos ever.

  • All right?

  • So just very, very briefly.

  • I wish I could just be like, yeah, well, magic. Not so much.

  • I am, you know, in the process of writing a post about how that works and stuff like that, if you want to check it out.

  • So just quickly, a few resources; I can share the slides later.

  • But here's a few tools and useful links and data sets, and I'm way over time, so thank you so much for being with me this morning.
