Using GPUs in TensorFlow, TensorBoard in notebooks, finding new datasets, & more! (#AskTensorFlow)

  • ♪ (music) ♪

  • Welcome back.

  • - I'm Paige Bailey, and this is... - Laurence Moroney,

  • and we are here to answer all of your Ask TensorFlow questions

  • at the TensorFlow Dev Summit.

  • So, if you have any questions, please submit them to social media

  • with the hashtag #AskTensorFlow.

  • And we'll answer as many of them as we can,

  • but any that we can't get to today we'll try to reach out to you later

  • - to answer them. - Absolutely.

  • So let's get started with the first question.

  • - Okay, so shall I try this one? - Yeah, absolutely.

  • So this one, I think, came on Twitter, from @yuehgoh, and it said:

  • "Once I installed tensorflow-gpu, import did not work for me.

  • I have been trying to use it," but has been unable to do so.

  • "It fails to load native tensorflow runtime."

  • I remember seeing this one.

  • Actually, it was on YouTube,

  • and it was in response to one of my videos

  • about pip installing TensorFlow GPU in Colab,

  • because once upon a time in Colab,

  • you had to pip install TensorFlow GPU to be able to use it,

  • and now if you try to do that, you end up having some issues.

  • The reason for that is actually really good, and it's good news,

  • and it's because you don't need to do it anymore.

  • Excellent!

  • So, actually, if we switch to Colab for a second on my laptop,

  • I can show you.

  • This was the notebook that I was showing earlier on.

  • And all you have to do, if you want to use GPU in Colab,

  • is just change your runtime type,

  • pick GPU as the hardware accelerator,

  • and now you don't need to pip install TensorFlow GPU;

  • it actually does it for you under the hood, behind the scenes.

  • It's really, really cool,

  • and that's why earlier I was able to train this so quickly,

  • because I was actually using the GPU.

  • And, as you can see, there's no pip install GPU on here.

  • (Paige) Excellent.
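A quick check that isn't shown in the video but may be useful: once the runtime type is set to GPU, TensorFlow should list the accelerator on its own. A minimal sketch, assuming a TF 2.x Colab runtime (the `experimental` config spelling is the TF 2.0-era API):

```python
import tensorflow as tf

# After choosing Runtime > Change runtime type > GPU in Colab,
# TensorFlow should see the accelerator with no extra pip install.
print("TensorFlow version:", tf.__version__)
print("GPUs visible:", tf.config.experimental.list_physical_devices("GPU"))
```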

  • Whenever we were initially testing TensorFlow 2.0,

  • we had a kind of similar issue, as well, with the GPU install,

  • in that you needed specific CUDA drivers.

  • But now, CUDA 10 is supported in Colab, as well.

  • So Colab is a great experience if you're using a GPU

  • or if you're using any other accelerated hardware.

  • (Laurence) Yeah, and a pro tip going forward, as well,

  • if you want to do GPU stuff,

  • because this was something that I ran into a number of times

  • when trying to use the GPU,

  • is that you always have to carefully take a look

  • at the version of CUDA and cuDNN that you're using.

  • Because I made the mistake

  • that I just went to the vendor's website,

  • I downloaded the latest versions, I installed them,

  • and then I saw TensorFlow was actually supporting

  • a slightly earlier version.

  • So if you do get an error when you're trying to use GPU,

  • just take a look at the version

  • of the driver that it's looking to support,

  • and then, from the vendor's website,

  • download that specific version.

  • - (Paige) Yeah, driver issues... - Driver issues.

  • (Paige) They're always a treat, right?

  • Exactly!

  • It's one of the things that makes our job interesting.

  • Absolutely.
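A small sketch of the kind of check Laurence describes (not shown in the video): confirm which TensorFlow release you have and whether that build expects CUDA, then match the CUDA/cuDNN versions from the vendor's site against what that release supports in the install docs.

```python
import tensorflow as tf

# Check the installed TensorFlow release and whether this build was
# compiled against CUDA; then install the CUDA/cuDNN versions that
# this specific release supports, not simply the newest ones.
print("TensorFlow version:", tf.__version__)
print("Built with CUDA:", tf.test.is_built_with_cuda())
```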

  • Alright, so shall we take a look at the next one?

  • Yeah, let's go-- Oh! @adkumar!

  • So @adkumar had at least eight excellent questions on Twitter.

  • Maybe he wins the volume award.

  • He absolutely does!

  • And we'll get to all of them,

  • not in this sort of Ask TensorFlow segment,

  • but let's focus on just one for today

  • and then answer the rest offline.

  • Yeah, and I think a lot of them were really about file formats

  • and how do I use different file formats,

  • so shall we drill into that?

  • Absolutely. So the question is:

  • "Which is the preferred format for saving the model going forward,"

  • saved_model or something else?

  • And, if we look at the laptop, we can take a gander

  • at one of the slides from the keynote this morning,

  • really showing that Keras is a first-class citizen

  • in TensorFlow 2.0

  • and SavedModel is at the heart of every deployment.

  • So here you can see SavedModel being used for TensorFlow Serving,

  • for TensorFlow Lite, for TensorFlow.js,

  • and lots of other language bindings,

  • so really, we're pushing for the SavedModel.

  • (Laurence) And if you focus on SavedModel you can't go wrong.

  • (Paige) Yes, absolutely.

  • It's a lot easier to use

  • than some of the other deployment options that we'd seen before.

  • Yeah, so I think that would be the guidance and the recommendation,

  • not just for AD, but for everybody else.

  • And, when you're thinking about saving out your models,

  • take a look at SavedModel, consider using SavedModel,

  • because the advantage is not only the file format itself,

  • but also how it's supported across all of these things.

  • And an excellent point about TensorFlow 2.0--

  • I'm just going to keep selling it-- is that we have a number

  • of code samples and tutorials available today

  • about how you can deploy your models with SavedModel.
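For reference, a minimal sketch of exporting a Keras model in the SavedModel format with TensorFlow 2.0; the toy model and the export path here are placeholders, not something from the talk.

```python
import tensorflow as tf

# A toy Keras model standing in for whatever you have trained.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Export in the SavedModel format, which TensorFlow Serving,
# TensorFlow Lite, and TensorFlow.js can all consume.
tf.saved_model.save(model, "/tmp/my_saved_model")

# Load it back later, or hand the directory to a deployment tool.
restored = tf.saved_model.load("/tmp/my_saved_model")
```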

  • Yeah, and I've personally found,

  • from playing with some of the TensorFlow Lite stuff in 2.0,

  • that saving as a SavedModel and then going through

  • the TF Lite conversion process was a lot easier for me

  • than in previous iterations

  • where I had to use TocoConverter and all that kind of stuff.

  • So it's really being refined. We're really iterating on that.

  • - And I think it looks really cool! - Excellent!
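A minimal sketch of the SavedModel-to-TF Lite path Laurence mentions, using the TF 2.0 converter; the paths are placeholders.

```python
import tensorflow as tf

# Convert a SavedModel directory into a TensorFlow Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model("/tmp/my_saved_model")
tflite_model = converter.convert()

# Write the flatbuffer out for use on device.
with open("/tmp/model.tflite", "wb") as f:
    f.write(tflite_model)
```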

  • So thanks for all of those questions, AD. There's some great stuff in there.

  • We'll try to answer some of the rest of them,

  • but we understand that most of them are focused

  • around the file format, and hopefully SavedModel will help you.

  • - Alright. - Perfect, so let's go to the next one.

  • So this next one comes from Elie Gakuba,

  • asking, "Is it possible to run tensorboard on colabs?"

  • - And I know this made Paige really happy! - Ah, yes!!

  • Oh, dude! You are going to be so delighted!

  • Because before TensorBoard was running on Colabs,

  • we were talking about it: "We really want it on Colabs!"

  • It was so painful.

  • And if you wanted to get it working

  • in a Colab notebook or in Jupyter,

  • you ended up using a tool like Ngrok,

  • and that was kind of not approved by our bosses, or in general.

  • But yes, the good news is that you can run TensorBoard in Colabs.

  • (Laurence) And when it was first announced internally in Google,

  • before it was publicly announced,

  • we all got this email from Paige,

  • and it was full of all these smiley emojis and hearts.

  • (Paige laughs)

  • So, Elie, thank you for the question,

  • because I think you've made Paige's day.

  • (Paige) Excellent! And so you can see here in the Colab,

  • I'm running through and downloading some files

  • in the hope that we could play with it a little bit,

  • but here you can see it actually working.

  • You should be able to do different operations like smoothing,

  • changing some of the values,

  • and then also using the embedding visualizer

  • directly from your Colab notebook

  • in order to understand your accuracies

  • and to be able to do model performance debugging.

  • Another nice thing that the team has been working very, very hard on

  • is that you don't have to specify ports.

  • So if you wanted to have multiple TensorBoard instances running,

  • you don't have to remember that you were using,

  • what is it, 6006, for one and some other port for another.

  • It just automatically selects one that would be a good candidate

  • and creates it for you.
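A minimal sketch of what this looks like in a Colab or Jupyter cell, assuming you log with the standard Keras TensorBoard callback; the log directory is a placeholder, and some earlier previews loaded the extension as `tensorboard.notebook` rather than `tensorboard`.

```python
# Run in a Colab or Jupyter notebook cell.
%load_ext tensorboard

import tensorflow as tf

# Log metrics during training with the Keras TensorBoard callback.
tb_callback = tf.keras.callbacks.TensorBoard(log_dir="logs")
# model.fit(x_train, y_train, epochs=5, callbacks=[tb_callback])

# Display TensorBoard inline; a free port is chosen automatically.
%tensorboard --logdir logs
```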

  • So the team is phenomenal.

  • If you have any interest whatsoever in TensorBoard at all,

  • I suggest stalking their PRs, like I do,

  • because that's how I found out

  • that TensorBoard got added to Jupyter notebooks and also to Colab.

  • But, yeah, so excited.

  • And we'll have this link in the documentation for the video,

  • as well, the little notes underneath, for you to go and play with.

  • And I do have to say, it's so great to have a PR stalker in our group.

  • (laughter)

  • I get push notifications to my phone-- It's a problem.

  • But yeah, they've been doing such great work.

  • (Laurence) So the question is yes, TensorBoard is working in Colab.

  • - And also Project Jupyter notebooks! - Nice.

  • Use it wherever! TensorBoard everywhere!

  • TensorBoard everywhere; we all love TensorBoard.

  • I haven't really played with it that much,

  • but do a lot of the plugins also work?

  • So TensorBoard really is a collection

  • of these different visualizations,

  • so you can see scalars, like your accuracy;

  • you can see histograms;

  • you can see the embedding visualizer,

  • which allows you to do clustering,

  • like that great MNIST example

  • from the Dev Summit a couple years ago.

  • - With all that stuff. - Moving it around.

  • - I play with that all day. - It's beautiful.

  • And then also Beholder, which was created by a community member.

  • The plugins... It's so funny you mention it.

  • A couple of our GSOC, our Google Summer of Code projects

  • this summer are focused on getting

  • additional visualization plugins added to TensorBoard.

  • (Laurence) Cool! Nice!

  • So in addition to the What-If Tool, in addition to Beholder,

  • you could make your own TensorBoard plugin.
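Those dashboards are driven by summary data. A minimal sketch of writing scalar and histogram summaries with the TF 2.0 `tf.summary` API, with made-up metric names and values purely for illustration:

```python
import tensorflow as tf

writer = tf.summary.create_file_writer("logs/demo")

with writer.as_default():
    for step in range(100):
        # Scalars dashboard: a fake accuracy curve.
        tf.summary.scalar("accuracy", 0.5 + step / 250.0, step=step)
        # Histograms dashboard: the distribution of some fake weights.
        tf.summary.histogram("weights", tf.random.normal([1000]), step=step)
```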

  • While I could geek out about this all day, I think we should move on

  • to some other questions that some folks have had.

  • Can't we geek out about this all day?

  • Can we extend the stream for three or four hours?

  • So the next question that came in, from Amirhosein Herandy,

  • "How would you use feature_columns with Keras?"

  • I know you know the answer to this question.

  • But I know that it's a particular passion of yours

  • that you're working with Estimators and Keras.

  • So, for the folks who are watching,

  • feature columns are really part of Estimators.

  • They're a way of really getting your data

  • efficiently into Estimators.

  • And with people saying, "Hey, it's all there in Estimators.

  • What about Keras?"

  • You've been working on some great stuff around that, right?

  • I have, but I know that your YouTube channel

  • has a series from Karmel, who spoke earlier today.

  • Yeah, so Karmel is our engineering director

  • for high-level APIs, and she has this great series

  • around high-level APIs for TensorFlow,

  • really, really teaching you how to use the high-level APIs.

  • And Karmel and her team are working actively

  • on parity for things like feature columns in TensorFlow 2.0.

  • I'm not sure if it's fully there yet in the Alpha.

  • I haven't checked into it yet.

  • But, yeah, it's on the way, if it's not there already.

  • So you should be able to use them in Keras.
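One way this works in TF 2.0 is the `DenseFeatures` Keras layer, which consumes ordinary feature columns. A minimal sketch with made-up column names and vocabulary:

```python
import tensorflow as tf

# Define feature columns, just as you would for an Estimator.
age = tf.feature_column.numeric_column("age")
thal = tf.feature_column.indicator_column(
    tf.feature_column.categorical_column_with_vocabulary_list(
        "thal", ["fixed", "normal", "reversible"]))

# Feed them into a Keras model via the DenseFeatures layer.
model = tf.keras.Sequential([
    tf.keras.layers.DenseFeatures([age, thal]),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Training input is a dict of feature name -> tensor, e.g. from tf.data:
# model.fit(dataset, epochs=5)
```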

  • (Paige) Yes, and we're also in the process of building out--

  • If you wanted to migrate your models from using Estimators

  • to being more of a TensorFlow 2.0 format

  • with Keras,

  • I am currently in the process of building a migration guide.

  • So if you have any interest around that,

  • please feel free to reach out.

  • And we're excited to get that released pretty soon.

  • I'm really excited to see that, personally,

  • because the very first stuff I did in TensorFlow

  • was with Estimators, before I learned Keras,

  • and I really want to go back and start changing them to use Keras

  • without rewriting them.

  • Absolutely.

  • Keras is just friendlier to work with, I feel.

  • Yeah, so they're both great.

  • Estimators really give you the power. Keras is great for beginners.

  • So hopefully we'll get the best of both worlds.

  • Shall we take a look at the next question?

  • Yes!

  • Jeff Thomas asks, "looking for some simple data sets

  • for testing and comparing different training methods."

  • Aha! Different! "Looking for new data sets."

  • MNIST is great. Fashion-MNIST is great.

  • But, after a while,

  • people want something new and fresh, right?

  • Yes.

  • What do you think we can say to Jeff?

  • We could tell them about TensorFlow Datasets!

  • And there's a great blog post right here about it.

  • Yes, it is.

  • And TensorFlow Datasets is really about creating

  • those data-ingestion pipelines

  • for you to be able to easily use a variety of datasets

  • with all of your deep learning and machine learning models

  • with just a few lines of code.

  • So, if you're familiar with Scikit-learn

  • and all of its nifty data-ingestion practices,

  • this feels very similar.

  • It's very easy to do training and testing splits

  • and verification [splits],

  • and we have a lot of datasets readily available right now

  • for you to go and explore.
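A minimal sketch of loading one of those ready-made datasets with TensorFlow Datasets; the dataset name here is just one of the many in the catalog.

```python
import tensorflow_datasets as tfds

# Download, extraction, and the train/test split are handled for you;
# the result is a tf.data.Dataset ready to feed into a model.
train_ds, info = tfds.load("fashion_mnist", split="train",
                           as_supervised=True, with_info=True)
test_ds = tfds.load("fashion_mnist", split="test", as_supervised=True)

print(info.features)  # image shape, dtype, number of classes, and so on
```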

  • Another thing that I would especially like to call out

  • is a member of our community--

  • So anybody can make their data famous.

  • If you have a dataset that you're using in your research lab,

  • if you have a bright and shiny CSV

  • that you think would be a cool add to the structured section--

  • I've never heard a CSV called bright and shiny before.

  • Well, you know... Everybody uses them.

  • One of our community members, Andrew Kondrich,

  • who's an undergrad researcher at Stanford,

  • he added this CheXpert Dataset from his lab--

  • 200,000 chest radiographs, from the Stanford ML Group.

  • And he was able to do it in less than a day.

  • It really is just as simple as-- take the template format

  • for images or for audio or whatever you're using,

  • add some additional metadata for the dataset,

  • change, potentially, the types

  • for some of the features that you're using with it,

  • - and, voila! there you go. - You're off to the races!

  • Absolutely, and you can make your data famous!
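A heavily abbreviated sketch of the template Paige describes, in the 2019-era builder API. Every name here is illustrative, real builders also need download and checksum logic, and the exact builder API has shifted a little across TFDS releases.

```python
import tensorflow_datasets as tfds


class MyLabDataset(tfds.core.GeneratorBasedBuilder):
    """Hypothetical example of contributing a dataset to TFDS."""

    VERSION = tfds.core.Version("1.0.0")

    def _info(self):
        # Describe the features this dataset provides.
        return tfds.core.DatasetInfo(
            builder=self,
            description="Chest images collected by an imaginary lab.",
            features=tfds.features.FeaturesDict({
                "image": tfds.features.Image(),
                "label": tfds.features.ClassLabel(names=["normal", "abnormal"]),
            }),
        )

    def _split_generators(self, dl_manager):
        # Normally you would download and extract files via dl_manager here.
        return [
            tfds.core.SplitGenerator(
                name=tfds.Split.TRAIN,
                gen_kwargs={"images_dir": "/path/to/train"},
            ),
        ]

    def _generate_examples(self, images_dir):
        # Yield (key, example) pairs matching the features declared above.
        yield "example-0", {"image": images_dir + "/img0.png", "label": "normal"}
```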

  • And one of the really important things about it,

  • particularly if you're getting started or if you're learning,

  • is that if you take a look at a lot of the samples

  • pre-TensorFlow Datasets,

  • there tends to be lots and lots of code

  • about download your data, unzip it here, label it like this,

  • put these files in these folders,

  • or take your CSV and make these features,

  • but when it's in TFDS, it's like one or two lines of code,

  • and the data will just get sorted

  • into training and test sets for you, that type of thing.

  • So I think for learners, in particular,

  • I found it really, really exciting

  • because I didn't have to go through 100 lines of code

  • before I got to the [neural] network.

  • And also data science-y people.

  • So much of creating and training a model

  • is understanding the shape and the format

  • and the statistical distributions, and this is really helpful.

  • (Laurence) Yeah. So thanks very much.

  • ♪ (music) ♪

