
  • YUFENG GUO: Keeras?

  • Kaaris?

  • Kras?

  • Keras?

  • Carrots?

  • Keras.

  • What is Keras, and how can you use

  • it to get started creating your own machine learning models?

  • Stay tuned to find out.

  • [ELECTRONIC BEEPING]

  • Welcome to Cloud AI Adventures, where

  • we explore the art, science, and tools of machine learning.

  • My name is Yufeng Guo.

  • And on this episode of AI Adventures,

  • I'll show you how to get started with Keras in the quickest way

  • possible.

  • It's never been easier to get started with Keras.

  • Not only is Keras built into TensorFlow

  • via tensorflow.keras, you also don't even

  • have to install or configure anything

  • if you use a tool like Kaggle Kernels.

  • All you need to do is create your Kaggle account if needed

  • and sign in.

  • Then you have access to all that Keras has to offer.

  • Keras also exists as a standalone library,

  • but the TensorFlow version has the exact same APIs

  • and some extra features.

  • Let's head over to my Kaggle Kernel,

  • where I'll show you how to get started using Keras right now.

  • In a previous episode, we did some machine learning

  • on the Fashion-MNIST dataset.

  • It's a dataset of 10 different types of fashionable items,

  • from pants and shirts to shoes and handbags,

  • all presented in 28 by 28 pixel grayscale.

  • Mmm, grayscale.

  • Today, we'll do a similar analysis using Keras.

  • So to use Keras, we'll just import TensorFlow like usual.

  • These imports are actually identical to what

  • we had before.

  • And we'll pull in numpy, pandas, and matplotlib

  • just as we normally would.
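
A minimal sketch of the imports described here (Keras comes in via tensorflow.keras, so there is nothing extra to install):

```python
import tensorflow as tf
from tensorflow import keras

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
```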

  • And we continue operating kind of in the usual way.

  • We'll pull in our training and test CSVs

  • and load them up in pandas and take a look

  • at what they look like.

  • We've got our Label column on the far left

  • with numbers from zero through nine.

  • And we have our Pixels--

  • pixel 1, 2, 3, all the way up to pixel 784.
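
Roughly what that loading step looks like; the file paths and exact column names (label, pixel1 through pixel784) are assumptions based on the Kaggle Fashion-MNIST CSVs:

```python
import pandas as pd

# Paths are placeholders; on Kaggle they depend on where the dataset is mounted.
train_df = pd.read_csv("fashion-mnist_train.csv")
test_df = pd.read_csv("fashion-mnist_test.csv")

# Each row: a label column (0-9) followed by pixel1 ... pixel784.
print(train_df.head())
```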

  • I've got a function here to preprocess

  • our data a little bit.

  • It's similar to what we had before.

  • I've just cleaned it up and made it a little more concise.

  • I'm pulling out the features and dividing by 255,

  • so we normalize all the grayscale values

  • to be between zero and one.

  • And I'll pull out the labels as well

  • and have them both be represented as numpy arrays.

  • We use that function to pull out our training and test

  • data from the data frame and associate them

  • to explicit variables-- train_features

  • and train_labels, test_features and test_labels.

  • And we can see that the final shape of these variables

  • are exactly as we would expect--

  • 60,000 examples with 784 columns.

  • And then our labels are just the 60,000 values.
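
A sketch of that preprocessing step, assuming the train_df and test_df data frames from the loading sketch above and a label column named "label":

```python
import numpy as np

def preprocess(df):
    # Drop the label column, scale pixel values from 0-255 down to 0-1,
    # and hand back plain numpy arrays for features and labels.
    features = df.drop(columns=["label"]).values.astype("float32") / 255.0
    labels = df["label"].values
    return features, labels

train_features, train_labels = preprocess(train_df)
test_features, test_labels = preprocess(test_df)

print(train_features.shape)  # (60000, 784)
print(train_labels.shape)    # (60000,)
```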

  • And we're going to take a peek at one of them.

  • This is our 20th training_feature set.

  • And some pixels in the middle, we

  • can see that they're indeed values between zero and one.

  • And we can also visualize it.

  • Here we have a shirt, and we can see that it looks exactly

  • as you'd expect--

  • kind of grainy and grayscale.
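
Peeking at and plotting a single example might look like this (index 20 here just mirrors the example in the video):

```python
import matplotlib.pyplot as plt

example = train_features[20]

# A slice from the middle of the image: values between 0 and 1.
print(example[300:320])

# Reshape the flat 784-value row back into a 28x28 grayscale image.
plt.imshow(example.reshape(28, 28), cmap="gray")
plt.show()
```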

  • Now, with Keras in this particular case,

  • we're going to need to one-hot encode our data.

  • And what that means is we're going to take our training

  • labels, which used to be just values like zero, three, seven,

  • and turn them into--

  • each of them-- into an array of length 10.

  • All 10 values in the array will be zeros except for one value.

  • That one value will be a 1.

  • And so that's why it's called one-hot encoding.

  • Now, where is that one located?

  • It's going to be exactly the number that it came from.

  • So, for example, if the value was seven,

  • the value at index seven will be a one.

  • If the number was four, then the value at index four will be a one--

  • hence one-hot encoding.

  • And so we'll run keras.utils.to_categorical,

  • which is a handy utility function that will just do this

  • for us.

  • And we'll observe that the train labels have now

  • turned from 60,000 rows of numbers

  • to 60,000 rows with 10 columns.

  • And we can see that indeed, in that same example label that we

  • saw before now, the zeroth index has a 1,

  • and everything else remains a zero.
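
The one-hot step with the utility function mentioned above, with num_classes set to 10 for the ten clothing categories:

```python
from tensorflow import keras

train_labels = keras.utils.to_categorical(train_labels, num_classes=10)
test_labels = keras.utils.to_categorical(test_labels, num_classes=10)

print(train_labels.shape)  # (60000, 10)
print(train_labels[20])    # one 1 and nine 0s, e.g. [1. 0. 0. ... 0.] for label 0
```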

  • And now comes the really fun part of working with Keras--

  • creating our model.

  • Keras supplies a really easy and intuitive way

  • to build up your model from the ground up.

  • In this case, we're going to make a sequential model

  • and add layers on top of it.

  • The first layer we'll have has 30 nodes

  • and has an activation function of a rectified linear unit

  • or relu, which in the case of TensorFlow that we used before,

  • was the default activation function.

  • Then we'll have another fully-connected layer

  • or dense layer with 20 neurons, this time

  • also with a relu function.

  • And finally, we'll do our final mapping to the 10 output

  • values of zero through nine and have an activation

  • of the softmax, which basically just distributes

  • our probabilities across the 10 buckets.
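
A sketch of that model: the layer sizes and activations follow the description above, while the input_shape of 784 is inferred from the flattened 28 by 28 images:

```python
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(30, activation="relu", input_shape=(784,)),  # first hidden layer
    keras.layers.Dense(20, activation="relu"),                      # second hidden layer
    keras.layers.Dense(10, activation="softmax"),                   # one output per class
])
```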

  • And now we're ready to compile our model.

  • Keras uses this notion of compiling

  • a model, similar to when you use something

  • like a string builder, to just say, I'm done.

  • Put it all together for me.

  • And we'll supply a loss, optimizer, and metrics

  • for what kind of values we want to get out of it, for how

  • to optimize for the best values, as well as

  • how we want to measure loss.

  • In this case, we're using categorical cross entropy

  • because our outputs are categorical.

  • And cross entropy, in this case, happens to be a nice way

  • to measure our loss or error.
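
Roughly what that compile call looks like; the loss and accuracy metric match the description, but the choice of optimizer is an assumption, since the video does not name one:

```python
model.compile(
    loss="categorical_crossentropy",  # categorical outputs, as described above
    optimizer="adam",                 # assumption: any standard optimizer works here
    metrics=["accuracy"],
)
```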

  • With our model created, we're ready to run training.

  • Training with Keras is as easy as calling .fit.

  • When we call .fit, all we need to supply are

  • the training_features and training_labels.

  • It's also a good idea to supply epochs and batch_size

  • so that we can control the training a little more.

  • In this case, we have supplied an epoch

  • of two, which means we'll go through the entire dataset

  • twice over.

  • And we'll supply a batch_size of 128.

  • This means that with each training step,

  • the model will see 128 examples which will help guide

  • it to adjust its parameters.
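
The training call, with the epochs and batch size mentioned above:

```python
history = model.fit(
    train_features,
    train_labels,
    epochs=2,        # two full passes over the 60,000 examples
    batch_size=128,  # 128 examples per training step
)
```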

  • And so we can see here Keras has some really useful helpers

  • as the training happens and gives us

  • a sense of the progress.

  • It also then prints out the loss and accuracy

  • at the end of each epoch.

  • But seeing the accuracy and loss at the end of training

  • isn't nearly as useful as evaluation.

  • We need to see the accuracy against our actual test

  • dataset.

  • So let's call our model.evaluate function

  • and this time pass in our test_features and test_labels.

  • This will give us an accuracy, and we can

  • print that out and take a look.
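
The evaluation step might look like this, assuming the one-hot encoded test_labels from earlier:

```python
# evaluate() returns the loss followed by each metric we asked for.
test_loss, test_accuracy = model.evaluate(test_features, test_labels)
print("Test accuracy:", test_accuracy)
```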

  • We can see we got 84.7% accuracy.

  • And of course, we could certainly

  • do better than that with increased epochs,

  • a more sophisticated model, and other approaches.

  • But this is just an intro to Keras.

  • And hopefully, this will give you a good starting point

  • to start playing around with Keras

  • and seeing all that Keras can do.

  • Keras has an amazing community and lots

  • of samples which, when you combine

  • with Kaggle's community, gives you

  • a truly epic set of resources to get you started the right way.

  • Thanks for watching this episode of Cloud AI Adventures.

  • And if you enjoyed it, please like it

  • and be sure to subscribe to get all the latest episodes right

  • when they come out.

  • Now, what are you waiting for?

  • Head on over to Kaggle and start playing around with Keras

  • today.
