♪ (music) ♪

Hi everybody. Exciting event so far, and it's great to see all the questions that have been asked on social media. Please keep asking them and tagging them with #TensorFlow. We're here today to answer them live on the live stream. - I'm Laurence Moroney. - And I'm Paige Bailey, and remember it is #AskTensorFlow. #AskTensorFlow. Because we're the TensorFlow that you shall be asking. Exactly, so we're here to try and answer as many questions as we can. Sorry if we don't get to them all, but we'll do our best. So shall we take the first question? Absolutely, let's go for it.

So, one of the first questions that we're going to cover-- and it's one I bet you've had today; I get it almost every time I meet a TensorFlower-- is: it's great to be able to do training, and usually you'll do your training for a fixed number of epochs, but what happens when I reach my desired accuracy metric? How do I cancel training? Right, it doesn't make any sense to keep using compute if you've already gotten to a point where your boss would say, "Okay, cool, 99% accuracy-- that's fine for us today." (Laurence) So shall we take a look at how we'd actually do that? Absolutely, let's go for it.

So, I've opened up a Colab here that you can see where I'm using callbacks, and callbacks are the way that you would actually achieve this. At the top of my Colab here, you can see I have class myCallback, and on this one, when an epoch ends in training, I'm able to take a look at the logs, and if the accuracy log, for example, in this case, is greater than 60%-- - 'cause I have a really nice boss, - (Paige laughs) (Laurence) he's really happy when my training's 60% accurate-- then I would actually cancel the training. And then to set that up, I'll just say I'm going to create an object called callbacks, which is an instance of my callback class, and I love the way in Colab when I double-click, it actually highlights-- it's just a little thing that I like. And then down here, on callbacks, I'll just say callbacks equals callbacks. And then when I actually do the training-- and you know what, I'm going to really show off and set my runtime type to GPU so it goes nice and fast. So this is Fashion-MNIST. Let's do a little bit of training on Fashion-MNIST with this one. And we're doing this live, so I'm connecting up to the VM. And here we go, now it's actually training. It's getting ready to start. - It's on the first epoch. - (Paige) It's showing you your RAM and disk utilization. (Laurence) We're on the first epoch. The first epoch is progressing away, 60,000 images being trained, and boom, I hit accuracy of 83%. - (Paige) That's pretty good. - In just one epoch, right? But we can see now that it actually reached 60% accuracy, so it canceled the training.

So callbacks are your friends if you're doing this, certainly when you're learning, when you're experimenting. Before I learned about callbacks-- I keep saying colabs-- before I learned about callbacks, I would set something up to train for a hundred epochs and then go to sleep, and then wake up the next morning and find that after three epochs it had done its job, and I'd wasted my time. So, use callbacks, I think, would be the answer to that. Absolutely, Keras callbacks are incredibly valuable. It doesn't just apply to accuracy either; there are a bunch of additional metrics that could be useful for your particular workflow. And this code also would work in TensorFlow 2.0.
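For reference, here is a minimal Python sketch of the kind of callback being demoed in that Colab-- stop training once a metric crosses a threshold. The model architecture, the 60% threshold, and the variable names are illustrative, not the exact Colab code; note that on older TensorFlow versions the metric key in the logs is 'acc' rather than 'accuracy'.

```python
import tensorflow as tf

# Custom callback that checks the accuracy at the end of every epoch
# and stops training once it crosses a threshold.
class MyCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # Older TF versions report the metric as 'acc' instead of 'accuracy'.
        acc = logs.get('accuracy', logs.get('acc', 0.0))
        if acc > 0.6:
            print('\nReached 60% accuracy, cancelling training!')
            self.model.stop_training = True

callbacks = MyCallback()

# Fashion-MNIST, as in the demo.
(x_train, y_train), _ = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Ask for many epochs; the callback ends training early
# once the accuracy threshold is reached.
model.fit(x_train, y_train, epochs=100, callbacks=[callbacks])
```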
- So it's Keras-- - (Laurence) Absolutely. - (Paige) Gosh, I love Keras-- - (Laurence) Keras, yes. This is a Keras love affair right now. And actually one of the really neat things about Keras that you may not realize, since we've been talking about TensorFlow 2.0, is that the code you write for TensorFlow 1.x is the same code for 2.0, but what's going on behind the scenes is that it's executing eagerly in 2.0, instead of in graph mode. So even though this Colab, I think, is running 1.13, this code will actually still run in 2.0 without you modifying it. (Paige) Absolutely.

Alright, so shall we take the next question? Oh, cool. So, our next question looks like it's from Twitter: "What about all the web developers who are new to AI, does the version 2.0 help them get started?" Oh, web developers. They are some-- oh, man. So I just got finished talking to two of the folks on the TensorFlow.js-- and you just pulled it up-- the TensorFlow.js team, about all of the cool new demos that they've seen arise from the community. It's really such a vibrant ecosystem of artists and creators that are using browser-based, or even server-based, tools now to create these machine learning models, training and running them.

Yep, so I think for web developers there's a whole bunch of ways that you can get started with this. You've mentioned TensorFlow.js, so let's talk about that for a little bit first. TensorFlow.js is a JavaScript library, and this library will allow you to train models in the browser as well as execute them. And that actually blew my mind when I first heard about it. The Node bindings-- like being able to use the GPU inside your laptop with Google Chrome, or your favorite flavor of browser, to train a model. That is absurd! - (Laurence) Sci-fi now, right? - Yeah, it is. We live in the future. So like you said, Node bindings as well, with Node.js-- so it's not just in-browser JavaScript, it's also server-side JavaScript with Node, right? And am I supposed to say Node or Node.js or-- (Paige) I don't know. I'll say Node.

And then of course, because it is in Node, there's one of my personal favorites-- are you familiar with Cloud Functions for Firebase? I'm not. Tell me more. I'm intrigued. So, I used to work on the Firebase team, so a shout out to all my friends at Firebase. - Alright, I'm leaving. - (laughs) No, I'm just kidding. I've heard so many good things about Firebase. So it's for mobile developers and for web developers, and one of the things that Firebase gives you are these things called Cloud Functions for Firebase. I've called up the webpage here with the URL. But in summary, what they do is allow you to execute functions on a backend without you needing to maintain a server infrastructure, and to execute them in response to a trigger. A trigger might be, for example, an analytics event or a sign-in event. (Paige) Or you get new data and you need to process it. - (Laurence) Bingo! - (Paige) Man, I should try this out for machine learning stuff. (Laurence) So now, the fact that they run JavaScript code with Node on the backend means you can actually train models in a Cloud Function, which just-- for me, that's amazing. So, web developers, there are lots of great options for you, however it is you want to do it-- in the browser, on the backend, on mobile, that kind of stuff. Hopefully there's lots of great stuff that you'll be able to get started with.
Absolutely, and for the question about TensorFlow 2.0 and whether it gives additional tools for application developers, I think it would mostly be in terms of the code samples and tutorials that we were mentioning before. We've also released some courses, so it's easier than ever to get started. And the models that you create using SavedModel can be deployed to TensorFlow Lite, to TensorFlow.js, to whatever. And the important thing is, we've been talking a lot about Keras-- this thing that we love-- and Keras layers are supported in TensorFlow.js, so it's not just for Python developers. If you're a JS developer, you can define your layers. And R developers-- there's Keras for R, which is awesome. It was created by J.J. Allaire and François Chollet, and they have a book out about it. (Laurence) Nice, cool. So, web developers, lots of options for you. - Yep. - Right, yep.

Shall we take the next question? This looks like it also came from Twitter: "Are there any TensorFlow.js transfer learning examples for object detection?" So TensorFlow.js is popular, we have learned. Yes, so object detection. So, how do we answer this one? Well, it depends on what you mean by object detection, because at Google we use that specific term for when, in an image, you've got lots of objects and you put bounding boxes around them, right? Right now there are no samples for that. - Sorry. - (Paige) No, there aren't, but the lovely thing is that the community is incredibly adept at creating TensorFlow.js examples. For example, Teropa's CodePens. And then also Victor Dibia, a machine learning Google Developer Expert, had a great recent example using it to track hand movements in the browser.

So, the question there was really about transfer learning, and even though we don't have a demo of transfer learning for object detection, I'd like to show a demo of transfer learning that detects a single item within a frame. We call that image classification. So, can I roll the demo? Please do. - Are you going to do your favorite? - I'm going to do Pac-Man. - Oh, you are-- I should've known. - I'm a child of the '80s; I love Pac-Man. If you look carefully it says, actually, "loading mobilenet" now. So what's happening is it just downloaded the MobileNet model. So what I'm going to do is add some new classes to the MobileNet model and then use transfer learning to learn them. (Paige) Will it see my finger-- there it goes. Do you want to do it? (Paige) No, no, no. Go for it, show me how.

So, Pac-Man-- old arcade game-- you move up, down, left, and right, and you try to run away from the ghosts. So I'm going to try and train it to move up when I point up like this. So I'm holding it down and I'm gathering a bunch of samples-- there are about 50 samples-- and then when I go right like this... I didn't really think this one through, though, 'cause then turning left is going to be hard. But bear with me-- and, like, 15. Maybe I'll do left like this and get my head out of the way. (Paige laughs) A few samples like that, and then down will look like this. Hopefully these aren't rude gestures in some country. And something like that. So I have now picked about 50 samples of these, and I'm going to retrain this model in the browser with transfer learning. So if you look over on the left here-- my learning rate, my batch size-- I'm just going to train it for 20 epochs. And I'm going to start training and we'll see, it starts going quickly.
You see my loss started at four and then went down-- now it's showing zero. So that's like, wow-- it's probably not actually zero, there's probably a digit beyond the six that are displayed-- but we see we have a very low loss. So now we can actually give it a try. Let me see if I can avoid getting eaten by ghosts. So, I'm going to start playing the game, and I'm going to move left, and you can see the bounding box around it kind of shows that-- Up! Up. No, okay. Up! Right! - No, go right. - (Paige) Oh no! (Laurence) I'm watching the screen instead of watching Pac-Man, but we can see now that I've actually trained it. Let's try going right this time. Come on, right-- there we go. And up. It thinks it's down. There we go, up, and right. Ah! As we can see, I'm not very good at this game. I wasn't even good at it with the joystick. (Paige) You're just using this as an excuse to play Pac-Man all day, - aren't you? - Exactly!

But that is just a great example of using transfer learning. You can take this sample apart-- and we have some other samples out there for transfer learning in JavaScript-- and see how easy it was for us to extract the features from MobileNet and then retrain it. It's actually moving as I'm talking, - (laughs) - picking up all my gestures. So, enough on Pac-Man, shall we move to the next question? (Paige) Absolutely. And I'd also want to point out that transfer learning can be used for a variety of use cases other than just images, too. So make sure to check out all of the great examples that we've got listed on the website. Sounds good.

And the next one also looks like it was from Twitter-- Twitter must be very popular. - I like Twitter. - I love Twitter. "Are you going to publish an updated version of the TensorFlow for Poets tutorial from Pete Warden, implementing TF 2.0, TFLite 2.0, and a lot of other shenanigans?" Yeah, the Neural Networks API, faster inference on Android. Yeah, and I love Pete Warden's codelab on TensorFlow for Poets. He also had a great talk today. Oh, I didn't get to see it. Do you want to take this question? Sure. So, the TensorFlow for Poets codelab-- at some point we will update it. I don't think there's an updated version available right now. But one of the things that I really liked about the TensorFlow for Poets codelab was it got me building a model very quickly that I could then use on a mobile device. The drawback was that it was a bunch of scripts that I ran, and I didn't really know what was going on with them. So one of the things that we've been doing is putting a whole bunch of new TensorFlow Lite examples online on the site. And I have them up here. So there are four new ones-- gesture recognition, image classification, object detection, and speech recognition. And what's nice about these is they're all open sourced, they're available for both Android and iOS, and they include full instructions on how to build them for yourself.

The image classification one is really fun. I'm actually going to try to run that in my-- whoops, I don't want Bitly-- I actually want to try and run that in my Android emulator, so we can see it running in an emulated environment. So let me get that started. Oh, we can see it being cached. So, for example, now here I'm actually running it. It's doing a classification of what's in the background. Like, if I hold up a water bottle-- whoops, this way. There we go, see, it actually detects it's a water bottle. Now this is running in the Android emulator.
This is using TensorFlow Lite, and this is the sample that's on there that basically does the same thing you would have seen in TensorFlow for Poets, where it's using MobileNet and building an application around MobileNet. But if you look, even running in the emulator, I'm getting inference times in the 100 to 170 millisecond range. - (Paige) It's so fast. - (Laurence) How cool is that, right? (Paige) The ability to take large-scale models and pull them down to a manageable size for a mobile or an embedded device is huge. I'm really excited to see what TensorFlow Lite does this year.

Yep, so we're working on a bunch of new tutorials, and those samples are out there. If you take a look at their GitHub page, you'll see there are details on how they're built. Let me just go back on here. So, for example, if I say "Explore on Android," you'll see there are details on how it's built, how you can put it all together, how you can compile it. You don't need to build TensorFlow in order to use TensorFlow Lite-- that was one bit of confusion that folks had in the past. Now, it's just a case of what you add to your build.gradle, or if you're an iOS developer, the pods that you add, that kind of stuff. So you can go and start kicking the tires on these applications for yourself. (Paige) Excellent. (Laurence) Alrighty. But we will have more codelabs, and I would love to get Pete's TensorFlow for Poets codelab updated, hopefully for I/O. Yes. ♪ (music) ♪
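As a companion to the demos above, here is a minimal Python/Keras sketch of the same two ideas-- using MobileNet as a frozen feature extractor with a small new classification head (the transfer-learning approach behind the Pac-Man demo, which itself runs in TensorFlow.js), and then converting the trained model for TensorFlow Lite. It assumes TensorFlow 2.x; MobileNetV2, the four gesture classes, and the placeholder training data are illustrative assumptions, not the exact code used in the demos.

```python
import tensorflow as tf

# Use MobileNet as a frozen feature extractor and add a small head on top,
# the same transfer-learning idea as the Pac-Man demo, just in Python/Keras.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights='imagenet')
base.trainable = False  # keep the pre-trained features fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation='softmax'),  # e.g. up/down/left/right
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train on your own labelled frames (hypothetical arrays, not provided here):
# model.fit(train_images, train_labels, epochs=20)

# Once trained, the same Keras model can be converted for TensorFlow Lite
# and bundled into an Android or iOS app.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
```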