Hi, I'm Dave Miller, and I'm going to show you how easy it is to make a classic back-propagation neural net simulator in C++. This tutorial is for the beginning to intermediate C++ programmer. If you can write maybe a simple console program with simple classes, then you can certainly make a neural net in C++. In addition to showing how the neural net works, we'll talk about some basic C++ programming concepts, such as class design, prototyping, data hiding, encapsulation, and things like that. The tutorial does not require any prior experience with more advanced topics like exception handling, class inheritance, or threads. So this will be a simple program, and you can download the resulting source code at my blog, which you can reach from our website.
A neural net, at the highest level of abstraction, at the black-box level, is just super simple. All it does is: you put numbers on its inputs, and it gives you numbers on its outputs. So it's just a mathematical transform. What's the big deal about neural nets? Well, the big deal is that sometimes you want to get from your inputs to your outputs, but you're not sure what the formula is, and you're not even sure how to mathematically derive the formula. But if you have a lot of real-world training data, where you know what the output should be given certain inputs, then you can throw all that training data at your neural net, and if it's successful, it will learn what the transform is. Then it will give you a reasonable result even on inputs it has never seen before.
Now if you look at the next level of neural nets, if you look inside the black box to see the next level of detail, it's pretty clear that a neural net contains just a bunch of neurons that are tightly connected, and they are arranged in columns that we call layers. There's always an input layer of neurons that accepts the input numbers. There's always a layer of output neurons, in this case just one, but you can have as many as you need for the problem you're solving. The outputs of the output-layer neurons become the outputs of the entire neural net. In the middle, in between the input and output layers, you have one or more hidden layers of neurons, and they're all connected so that each neuron connects fully to the neurons in the next layer to the right. You could have neural nets that are sparsely connected, or that have connections that feed back to earlier layers; that would give your neural net a little bit of memory. But for this tutorial we're going to keep things simple and just assume that our neural net will be fully connected, implicitly connected, and forward connected, so that each neuron fully connects to all of the neurons in the next layer to the right, and those are the only connections we'll manage.
Neurons come in different varieties depending on where they are. They're all pretty much the same, but there's a slightly different role that they play depending on where they live in the net. For example, the input neurons don't really do any processing; they just hold the input values for the whole neural net. If this were an electronic circuit, those would be like the connectors where you'd actually connect wires into the input layer. So all an input neuron needs to do is hold its input value on its output side, so that it can become an input to the next layer. The hidden-layer and output-layer neurons all do a very small, simple amount of processing. When their inputs change, what they do is sum up all of the input values; each input connection has a weight, so a neuron takes all the weighted inputs, sums them up, runs the sum through a simple little formula to shape the output, and that becomes its new output, which then becomes an input to the next layer. So when you change the inputs, the change trickles through the net, and the values on all the outputs change, until the output layer changes and gives you your answer. So it's the weights of the connections that actually form the mathematical transform. When you train the neural net to do something, it's changing all of its internal weights until it either solves the problem or it doesn't.
There's one other kind of neuron that we need to include, and that's the bias neuron. Depending on the problem you're solving, the neurons will sometimes work better if they have a constant bias input in addition to the other inputs. So what we can do to implement that, in this model that we're making, is put one extra input on all of the hidden-layer and output-layer neurons, and add one extra neuron to each layer, called the bias neuron. It's fully connected forward, but it has no inputs; its output is set to a constant 1.0 all the time. So all a bias neuron does is provide a constant 1.0 value on its output, all the time. The weights on its connections change just like a normal neuron's, but its output is constant.
You can start seeing a little bit of how the program might end up looking. We'll probably have one class, we'll call it the Net class, for the whole neural net. And a neuron probably has a nice little chunk of data that we could encapsulate in a class. The connection weights will need to be stored somewhere, and those are just a bunch of weights.
Let's start breaking down some of these ideas, on paper, or rather on-screen, and we'll do it in the form of a C++ program. So fire up your favorite IDE or your favorite text editor; we're making a C++ program. I'm calling mine neural-net-tutorial.cpp, and I'm using the Eclipse IDE on a Linux system, but I'm going to do my best to show you a solution here in ANSI/ISO standard C++, so that you can compile it anywhere you find a standard compiler.
We know we're going to have a class Net, and we know that our class Net will be a typical class with a public section and a private section. This is off to a great start, but before filling in any more details of class Net, here's what I suggest: test early, test often. So in this case, what we'll do for a test harness is to sketch out the main body down here.
And in the main program, to use our class Net, we would like to instantiate it without a whole lot of hassle. We'd like to be able to say something just as simple as the class name, Net, followed by the object we want to instantiate; let's call it myNet. And the constructor will need to take just a minimal amount of information: it needs to know the number of layers that you want in your neural net, and the number of neurons per layer. We'll pass that in; we'll call it topology. It'll be some sort of structure or class or object of some kind. We'll figure out the exact type later, but it'll be this small little object that we pass in there to the constructor, and that should be all it takes to construct a neural net. And then to train it, we need to be able to call a member function on it that feeds forward a bunch of input values, and so we'll call that feedForward; that's what the textbooks call that kind of operation for neural nets, so let's use that same terminology here.
And we know that we'll have to make some sort of array or structure, something that holds those input values, so let's call that object inputVals and we'll declare its type later. Then, after feeding forward a set of inputs, we need to tell the neural net what the outputs were supposed to have been, so that it can then go through and do its back-propagation learning. We'll call that backProp. And we know that we'll have to pass it a little array or structure of the target output values, and we'll figure out its type later.
And then, when operating normally after it's been trained, we're interested in the neural net's outputs, so we need some sort of way to get those back out. We'll say that we'll have an API here called getResults, and we'll pass it some sort of container that it can fill in with the results, and that's how we get our results back. So during training we'll use those first two functions, feedForward and backProp, and we're not really interested in the results during training; we're only interested in telling the net what its results should have been. And then during actual operation, after training, we'll use the feedForward and getResults member functions over and over. Eventually we'll have to put this into a loop here, so that we can loop through a whole mess of training samples. We'll work out those loop details later, but this is enough to show us that the public API for class Net needs to have feedForward, backProp, and getResults. That's probably sufficient; that's probably the entire interface to class Net. I can't think of anything else it's going to need yet, so let's go ahead and put those three member functions up here, along with a constructor. The constructor takes something called topology, and we'll figure out its type later. Then feedForward returns void and takes inputVals, backProp takes targetVals, and getResults takes resultVals.
It's a good thing, you know, when you're doing C++ programming, to be aware of const correctness. And if you're not into const correctness yet, if that's still a confusing topic, then you can just ignore anything I say about const qualifiers and leave them all out of the program; it will still run. But if you are interested in const correctness, I'll try to point out some of the places where the const qualifiers would go. getResults is a member function that just reads the output values of the output layer and puts them into a container and sends it back; it doesn't modify the object at all, so it's a const member function, and the const goes here.
Let's think now about what these container types are. Let's look at feedForward, where we send the input values into feedForward. inputVals needs to hold a bunch of doubles, and it could be, for example, a variable-length array of doubles. You might have a C++ compiler that does support variable-length arrays, and if you do, you could use that, and that would work just fine here, but that's not ANSI/ISO standard, and I promised I'd show you something that would compile on any standard compiler. So there is a standard container that we can use that acts like a variable-length array and is really easy to use: vector. It's part of the Standard Template Library, and it needs at least one template parameter inside angle brackets, specifying the type of the elements, and we will use doubles. So a vector of double means that we're going to use it as if it were a variable-length array of doubles; we can even use array notation on inputVals to access the individual members. Since this is part of the Standard Template Library, it needs the std:: namespace qualifier in front of it so the compiler can find it, and it also needs the header file <vector> up here, so that the compiler can see its declaration in the standard header.
There's an alternative to that std:: namespace specifier. Some people like to see it, because when they see it in the program they can instantly remember, "oh, vector is not one of my creations; it comes from the C++ standard library." But other people think it's just a bunch of clutter, so if you think it's clutter and you don't want to see those std names, there's an alternative: at the top of the program you can write using namespace std; and then that applies to the whole file. Now, any time the compiler can't find a name somewhere, it will also go look in the std namespace, and that means this std:: here is optional; you can put it there if you want, or not.
So now we know what the inputVals container is; it's a vector of double, so we can put the vector<double> type up here in the class Net declaration. We'll put an ampersand in front of inputVals to show that we're happy taking inputVals by reference. There's no need to pass a whole array or structure or vector by value when it's enough just to pass the reference, so we'll take the reference there. And because all feedForward needs to do is read the values from inputVals and transfer those into the input neurons, it doesn't change anything in the inputVals argument. So, because of that, it promises not to change the vector of double, and we can put const there in front of vector<double>. That says inputVals is a reference to a vector of double that we promise not to change anywhere in this function. OK, off to a great start here. We can say that the same kind of container applies to targetVals, so targetVals is also a reference to a vector of double, and it's also const, because we won't change the contents of that. resultVals will also be a vector of doubles, but in that case it does not take the const, because we are going to fill in numbers into the elements of that vector of doubles to return the results. However, getResults won't change the Net object at all, so that function is a const member function; the const goes in a different place on that one. So now we can come down here into the main part and finish putting in the definitions of those containers.
OK, the next data item to figure out is topology. We'd like to be able to send in a little array of integers, or rather unsigned integers, and the number of values that we send in is the number of layers. For instance, going with the picture we've been drawing, if we send in the little array of values 3, 2, and 1, that means we want three layers in the net: the first layer is the input layer, and it would have three neurons; then one hidden layer of two neurons; and one neuron in the output layer. So this could just be a vector of unsigned, like this, and that specifies everything the constructor needs to know to construct the neural net. We can go ahead and fill in the data type for topology up here in the Net constructor. And since the constructor is not really changing the elements in topology, we'll promise that we won't change it by putting a const there.
Now we can think about what class Net needs to do in its constructor. During construction, it needs to make a bunch of neurons, and the neurons are arranged in sort of a two-dimensional arrangement: layers, and neurons per layer. So one way to arrange that would be to have a two-dimensional array of neurons. I've found, since I've written this program before (I'm not writing it from scratch right now, so I've had the opportunity to see ahead), that it will become convenient for us to speak of layers. Sometimes, for some of the mathematical calculations, when you're feeding forward, a neuron needs to access information in the next layer to the right; or when you're doing back propagation, a neuron will need to access information in the neurons in the layer to the left. So a layer is kind of a convenient thing to have in our program as a data type. So one way we could do this is to just say that all of the neurons will be arranged in layers, and all the Net needs to do is to have an array of layers of variable length, which means a vector of Layer, and we'll define Layer up here to be a vector of Neuron. Now, about what that does: and let me just say, I put the m_ prefix on m_layers there because a lot of people like to name their private data members in such a way that it's readily visible in the program that this is a private data member, and if you'd like to do that, you can. So with this, we can reference an individual neuron in m_layers with two subscripts: if you put one subscript, it specifies a layer, and if you put another subscript after that, it specifies a specific neuron in that layer. And in order for this typedef up here to work, we need to have at least a forward reference to class Neuron. We don't need the full definition of Neuron yet, just an incomplete forward reference like that, and then that makes the typedef work.
OK, now we can go ahead and start writing this constructor, and you'll see what I mean about accessing the m_layers member here. To define the body of the constructor, we could do that inline right here in class Net, but in order to keep the class Net declaration nice and small and clean and easy to read, I'm going to define the Net constructor here, outside of class Net. To do that, we copy the declaration of the constructor, paste it down here after the class, and put Net:: (the class name and the scope operator) in front of it. And then we can start writing the body. We need to know the number of layers, and the number of layers is given by the incoming topology object. We can call the .size() member on it to get the number of elements in topology, and we'll put that into a variable here called numLayers, because it's easier to talk about numLayers than topology.size(); that's just for convenience and for documentation purposes.
And that lets us write a loop that takes the layer number from zero up through the number of layers. Inside this loop, on each iteration, we want to create a new layer and add it to m_layers. To do that, we write m_layers, and then we call its member function push_back; that's the standard container member function that you use to append a new element onto a standard container. The type of element that we want to append onto it is a Layer object, which we defined to be a container, and to construct one of those you just give the name of the type, Layer, along with the parentheses for the constructor. So that line means: create a new empty Layer and append it onto our m_layers container.
So we've got a new layer; now we need another inner loop that goes through and adds the individual neurons into that layer. So we can write a loop, and we'll call its index neuronNum; it starts at zero and goes up through the number of neurons in this layer. And that number is given inside the topology argument that comes in here: if we access the layerNum element, say the layer-number-three element, then that will be the number of neurons in that layer. You'll notice here that this loop says neuronNum goes up to less than or equal to that value. Normally you would just say from zero to less than that value if you wanted exactly that number of iterations, but because we want to add that one extra bias neuron, in addition to the number of neurons specified in the topology, we say less than or equal, and that makes the loop run one additional time, and that adds the one extra bias neuron onto the layer.
Now, what we want to do is create a new Neuron. We haven't even defined class Neuron yet, but that's OK; we can still write this line here that creates one. We want to create a new Neuron object, and then we want to append it onto the layer that we just got done creating. To address the layer that we just got done creating, which is the most recent one appended onto m_layers, we say m_layers.back(). .back() is the standard container member function that gives you the most recently appended element, the last element, in the container. Once we have the last element in the container, which is a Layer, we want to append something onto that layer, so we say .push_back(), and the thing we're pushing back onto the layer is a newly constructed Neuron, which we construct by naming the Neuron constructor. It will probably have some constructor arguments a little later, but this is a sketch to get us started for now. You can see the direction that we're going here. Here's a little class diagram already that shows that we have a class Net that has Layers, and the Layers have Neurons; we'll elaborate on that as we go.
Now let's try just a fast little compile here, and this is something I encourage you to do often: write a few lines, try a compile, and see what sort of compiler messages you get, because you might get some ideas of different directions that you want to go before you've gone too far down one direction. In order to compile a program like this at this early stage, we need to do a few little things. We need to give at least empty bodies to these member functions here, or else the linker will complain about missing bodies. Also, class Neuron needs at least an empty body. And then, in order to see that our program is running and doing something, let's actually create a net that consists of three layers: three input neurons, two hidden neurons, and one output neuron, like this. By saying topology.push_back() we're defining those three elements in topology and passing that into the constructor of myNet. And then up here, inside the Net constructor, each time it adds a new neuron onto a layer, let's output a statement that says "Made a Neuron", and we'll use a cout statement to do that. In order to use cout, of course, we need to come up here and include <iostream>. Now let's see if everything necessary for this to compile is in place; keep your fingers crossed. I'll drag this command window over here in order to do the compile, and we'll use g++, which is the standard compiler here in this particular environment. We'll compile our program and output the executable with the name neuralnet. It ran, and we didn't upset the compiler, so let's run the program to see what happens with our output statements here, and make sure we got the right numbers. We got four for the first layer, which means the three input neurons we requested plus one extra for the bias; the middle hidden layer has two neurons plus one bias; and the output layer has its one neuron plus one bias. You know, we'll never access that bias neuron in the output layer in the calculations, because nothing feeds it and it doesn't feed anything else, but it's just convenient to have all the layers be consistent; it makes it easier to remember how to write the loops.
So now we need to fill in just a little bit more detail on these classes. Let's clean up the code here a little bit, and let's make a place for writing our class Neuron. We need to leave this forward reference up here for the typedef to use; the typedef needs to have that forward reference, but we can't define all of class Neuron up there, because we'll see that Neuron needs to refer to Layer. They're mutually dependent, Neuron and Layer. So we'll give the forward reference for Neuron up here so that the typedef for Layer can work, and now we can write the whole definition further on down here. Class Neuron will be a typical class with a public section and a private section, and we know that the public section will have at least the constructor; we don't know its arguments yet, if it needs any. And we know that a neuron has a little bit of private data. The main, most important piece of data that it has is its output value, which is simply a double, so we'll put that there.
Also, here's a good place to think about all those connection weights. The connection weights could be handled in a lot of different ways. You could have a completely separate data structure for the connection weights, and if that separate data structure specified explicitly "this connection goes from this neuron to that neuron, and here's the weight," then you could come up with weird connections; you could have feedback, and sparsely connected nets, and things like that. But for our tutorial here, with our implicitly connected and fully connected net, we don't need all that information. We just need to store, within each neuron, the weights going from it to all of the other neurons that it feeds. So here's what we could do: make a vector of doubles, and we'll call it outputWeights, so that there is an element holding the output weight for each of the neurons in the layer to the right that this neuron feeds. I've found, though, since writing this before, that we'll reach a point where we need to store one more piece of information for each weight, and that's the change in weight; that's something that the momentum calculation uses to implement momentum. So since we need to store two doubles for each output connection, I'm going to say that this will be a vector of Connection, and up here we'll make a struct Connection that contains those two things, weight and deltaWeight. That makes it just a little bit easier to address. But you could have two completely separate containers of vector of double, one for the weights and another container for the delta weights; there are other alternatives like that that would work out equally well.
So the Neuron, in its constructor, needs to construct that vector of Connections, which means we need a little loop in its constructor that creates the right number of connections. But here's the thing about this Neuron constructor: how will it know the number of connections to make? It really doesn't know anything about the next layer unless we tell it. The minimum amount of information we need to tell it about the next layer, in order for it to do its job here, is the number of neurons there are in the next layer; it doesn't need to know anything else, just the number. So if we pass in the number of outputs that this neuron has, then that's enough for it to know how to create its output weights container. That means down here, in this place where we create the neurons, we need to pass in the number of outputs. Let's assume that we have a variable here called numOutputs; how would we get that value? Well, we could define it right here, after we've gotten into the loop for each layer, and define the number of outputs there. It's a little different depending on whether it's the output layer or a hidden layer, because the output layer has no further outputs. If the layer number in this loop is the output layer, which is the highest layer number (the number of layers minus one is the highest layer number), then the number of outputs is zero. Otherwise, in all other cases, the number of outputs is whatever is in the element of topology for the next layer over. So now that we've got the number of outputs, we can pass that into the Neurons.
We can come back up here, to where we're writing class Neuron, and do a little bit more work on that constructor. So let's write that constructor. To do that, we'll copy the declaration. I like writing new member functions right underneath the class like this; then you can see the whole class declaration right above as you're composing the code for the member function that you're writing. To the piece that we copied down, we add the class name, Neuron, and the scope operator in front. And now we can write the body of the constructor. We need a loop; we'll call the loop index c, for connections. It's a loop over connections from zero up to, but not including, numOutputs. And inside each iteration, we will append, which means we're pushing back, a new element onto the outputWeights container, and the thing we're pushing on is a new Connection structure, which we make by giving the type name, Connection, and an empty pair of parentheses, like that.
And then, also at the same time, we want to set the weight that's inside that Connection to something. There's an alternative here: you could make Connection a class that has its own constructor that gives itself a random weight whenever it's constructed. That would be a perfectly legitimate solution to this as well, but for this tutorial I'm going to say that Connection is just a dumb structure, so we have to set its member here. To set the member here, we'll address the new connection that we've just made with .back(), and then address its individual weight member, and we'll assign it a random weight. And where do the random weights come from? A function. We could just put a call to a random number generator right here inline, but this is something that you might want to fiddle with later, so we'll put it in a function, so that it's easier to see and to change and to experiment with. In the private section of class Neuron we'll say static, and it returns double: randomWeight. It takes nothing as an argument, and we'll just write the whole function body right here. And here's the magic incantation for returning a random number between zero and one: you take rand(), which can be a big number, and divide it by the maximum random number, RAND_MAX, and that gives you something between zero and one. And then, in order to use rand, we need to include the standard header <cstdlib>. That's it for the constructor for Neuron.
Let's go back down here to class Net. This is the higher-level class, and let's take a look at its feedForward, the thing we call for putting inputs into the net and feeding them forward. Let's define this member function right underneath here, underneath class Net. What does it need to do? Here's a good place to point out something: since we're doing this kind of ninja rapid prototyping, we're opening ourselves up to all sorts of different kinds of errors. In production code, you would want to litter this thing all over the place with error checking and error handling and error recovery, but in this kind of prototyping we want maybe something like that, without spending a whole lot of time on it. Here's something that you can do that's really useful during prototyping: you can use the assert statement built into C++. This is where you assert what you believe to be true at this point, and if, while the program is running, that condition is not true, you'll find out with a runtime error message. So here's what we're going to assert at this point, for example; here's an error that could happen, and we want to check for it. We want to check that the number of elements in inputVals is the same as the number of input neurons we have; otherwise feedForward doesn't really have anything reasonable that it can do. So we need to assert that the number of elements in inputVals, which is inputVals.size(), is equal to the number of elements in the input layer. The input layer is m_layers element number zero, and that input layer is a vector of Neurons, so to get the number of neurons, .size() gives us the number. Now remember that there's an extra bias neuron in each layer, so in order to account for that we'll subtract one, and that gives us the number of input neurons. So if the number of input values is the same as the number of input neurons, we're good. Now, in order to use assert, we'll come up here and include the standard header <cassert>.
and we can continue writing defeat for dinner function here now
So now the next thing that feedForward needs to do is take the values from the inputVals argument and latch those into the input neurons.
So we need a loop that goes through every input value and every input neuron.
We'll call the index i, and it will go from zero through the number of input values that we have.
And then inside each iteration, we need to assign the value to the input neuron.
To address an individual neuron in an individual layer, we write m_layers — element zero is the input layer, and element i of the input layer is the i-th neuron of the input layer.
Now, we don't have a way at this point to set its output value, because the neuron's output value is a private member of class Neuron, and that's not the class we're working in.
In order to do that, let's assume we're going to have a member function in Neuron, which we'll write later, called setOutputVal, and that gives class Net a legal way to set the neuron's output.
So that's a little accessor function — we'll write setOutputVal and getOutputVal and put those into class Neuron in a minute.
But assuming that that is there and available to us,
here in feedForward we're going to set the output value of the neuron to the i-th element of inputVals.
And that's it for that loop — once that loop has all the input elements set like that, we can do the forward propagation.
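The latching loop we just described might look like this as a sketch, assuming the setOutputVal accessor we promised to write (the surrounding types are simplified here, and the free function is just for illustration — in the tutorial this lives inside Net::feedForward):

```cpp
#include <vector>

class Neuron {
public:
    void setOutputVal(double val) { m_outputVal = val; }
    double getOutputVal() const { return m_outputVal; }
private:
    double m_outputVal = 0.0;
};

typedef std::vector<Neuron> Layer;

// Latch the input values into the input layer's neurons.
// m_layers[0] is the input layer; [i] selects its i-th neuron.
void latchInputs(std::vector<Layer> &m_layers,
                 const std::vector<double> &inputVals)
{
    for (unsigned i = 0; i < inputVals.size(); ++i) {
        m_layers[0][i].setOutputVal(inputVals[i]);
    }
}
```

Note that the loop only touches the first inputVals.size() neurons, so the trailing bias neuron in the input layer is left alone.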
Forward propagation in this case means looping through each layer, and then inside that, looping through each neuron in the layer, and telling each individual neuron to please feed forward.
So the outer loop will say the layer number goes from one this time — we're skipping the input layer because the inputs are already set, so we're starting with the first hidden layer, which is one.
We want the loop to go up through and including the output layer, so it runs while the layer number is less than the number of elements in m_layers.
Then inside there, we write the inner loop that loops through each neuron in the layer: the neuron index will go from zero through the number of neurons in that layer, and because of the bias neuron, we're going to do a minus one here.
And inside that, we want to tell each neuron to feed forward.
To address an individual neuron, we say m_layers — the first index is the layer number and the second index is the neuron number.
And let's assume that we're going to write a member function on class Neuron called feedForward — each neuron has its own feedForward, in other words, that does the nitty-gritty mathematical stuff inside it that updates its output value.
So class Net is more interested in getting the loops through the layers and the neurons right, and class Neuron will be responsible for doing the math.
You can start to see the division of responsibilities now — it's falling out naturally from the direction we're going here.
Does this feedForward need any kind of argument? Well, let's think about that.
When class Neuron is asked to feed forward, it's going to need to add up all of its input values and then apply a function to that sum to update its output value.
In order to get its input values, it needs to ask the neurons in the preceding layer what their output values are.
So it's going to need a way to loop through all the neurons in the previous layer.
So in order to do that, we could give class Neuron a lot of smarts — a lot of visibility into the whole arrangement of all the neurons.
In other words, we could make class Net and class Neuron friends of each other, so that class Neuron could look inside class Net and see all about its m_layers data structure.
But class Neuron doesn't need to know that much.
Class Neuron doesn't need to know about class Net at all. What we could do is give class Neuron just a handle, or a pointer, to the neurons in the previous layer — and that's all it needs.
Let's assume at this point that we'll have a reference to the previous layer, which is a container of neurons, that we can pass here.
And that's all that feedForward would need. In order to get the previous layer, we can define prevLayer up here as a reference to the previous layer, which is m_layers indexed by the current layer number minus one.
We create the reference by putting an ampersand in front of prevLayer.
That's very fast to execute — it's just a couple of machine instructions. We're not copying the whole container here; we're just making a pointer underneath the covers to that previous layer.
And then we're passing it into the neuron's feedForward.
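Putting those two loops together with the prevLayer reference, a sketch might look like this, again with simplified stand-in types; Neuron::feedForward is only stubbed out here, since its real math gets written a little later. The name forwardPropagate is just for this illustration — in the tutorial these loops sit inside Net::feedForward:

```cpp
#include <vector>

class Neuron;
typedef std::vector<Neuron> Layer;

class Neuron {
public:
    void setOutputVal(double val) { m_outputVal = val; }
    double getOutputVal() const { return m_outputVal; }
    // Stub: the real version sums weighted inputs from prevLayer.
    void feedForward(const Layer &prevLayer) {
        (void)prevLayer;
        m_outputVal = 1.0;
    }
private:
    double m_outputVal = 0.0;
};

struct Net {
    std::vector<Layer> m_layers;

    void forwardPropagate() {
        // Start at layer 1: the input layer's values are already latched.
        for (unsigned layerNum = 1; layerNum < m_layers.size(); ++layerNum) {
            // A reference, not a copy -- just a pointer under the covers.
            Layer &prevLayer = m_layers[layerNum - 1];
            // The -1 skips the bias neuron at the end of each layer.
            for (unsigned n = 0; n < m_layers[layerNum].size() - 1; ++n) {
                m_layers[layerNum][n].feedForward(prevLayer);
            }
        }
    }
};
```

Passing the previous layer by const reference is what keeps class Neuron ignorant of class Net: the neuron sees only the container of neurons feeding it, nothing more.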
So that means, up here in class Neuron, before we forget it, let's make a note that we're going to need a member function in here called feedForward, and it's going to take a reference to a Layer, which we'll call prevLayer.
feedForward doesn't need to modify anything in that container, so we can const-qualify it.
And also remember that we're going to need those accessor functions up here, setOutputVal and getOutputVal, so let's go put those in right now, completely defined in here, just to be done with those.
setOutputVal will return void and it will take a double, and all it needs to do — we'll just write it inline right here because it's so simple — is set the member output value to whatever that argument is.
And then the accessor getOutputVal returns it: it doesn't take anything, and it returns the output value. It's that easy.
And since getOutputVal doesn't do anything to modify the object, we can say that the whole function is a const function.
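A minimal sketch of those two inline accessors inside the class declaration:

```cpp
class Neuron {
public:
    // Inline setter: gives class Net a legal way to assign input values.
    void setOutputVal(double val) { m_outputVal = val; }
    // Inline getter: const-qualified, since it doesn't modify the object.
    double getOutputVal(void) const { return m_outputVal; }
private:
    double m_outputVal; // private: data hiding in action
};
```

This is encapsulation doing its job — other classes go through the public interface, and the member itself stays hidden.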
So now let's think about the neuron's feedForward. We've got class Net's feedForward that loops through the layers and the neurons, and then it delegates the rest of the operation to this class Neuron feedForward.
So let's write that — it's the one that does the math part.
In this feedForward member function, it's going to sum the inputs, so we'll need a variable to hold the sum — we'll call it sum.
And it needs a loop through all of the neurons in the previous layer.
So we'll make a loop: the neuron index goes from zero up to, but not including, the number of elements in prevLayer.
That also includes the bias neuron, by the way, because the bias neuron's output is one of the inputs to our neuron here.
So then the sum will accumulate the previous layer's neurons' output values.
At this point we could either access the output value member directly — if we had friend permission to access it — or go through its public API function like that.
Either way, if your compiler does optimizations, it will probably all end up being the same thing.
So to access that neuron in the previous layer, we index prevLayer by the neuron index number.
And to get its contribution, we need to multiply the input value by a weight — we access its output weights.
But now we need to index that by something — by our neuron's number.
Okay, now, our neuron here doesn't know its own index number.
So it doesn't know which of the output weights in that other neuron it needs to index here in order to get the weight of the connection pointing to it.
So what we need to do is tell our neuron what its index is, so that our neuron can hold that index and use it to index into the previous layer's weights.
So let's assume our neuron is going to have a member m_myIndex — let's assume it's available here and that we can use it here.
Assuming that, that means we've got the loop here that sums up all of the inputs times the connection weights.
Before we forget it, let's add that myIndex up here to the neuron's constructor — that's how it will get its index: we'll pass it into the constructor.
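Here is a sketch of how that might all come together, with some loud assumptions: the Connection struct, the fixed 0.5 weights, and the tanh transfer function are placeholders for this illustration — at this point in the tutorial the transfer function hasn't been chosen yet:

```cpp
#include <cmath>
#include <vector>

class Neuron;
typedef std::vector<Neuron> Layer;

struct Connection { double weight; };

class Neuron {
public:
    // myIndex tells this neuron which slot it occupies in its layer.
    Neuron(unsigned numOutputs, unsigned myIndex) : m_myIndex(myIndex) {
        // One output connection per neuron in the next layer.
        // Fixed 0.5 weights here just for illustration.
        for (unsigned c = 0; c < numOutputs; ++c) {
            Connection conn;
            conn.weight = 0.5;
            m_outputWeights.push_back(conn);
        }
    }
    void setOutputVal(double val) { m_outputVal = val; }
    double getOutputVal() const { return m_outputVal; }

    void feedForward(const Layer &prevLayer) {
        double sum = 0.0;
        // Sum the previous layer's outputs (bias neuron included),
        // each scaled by the weight of its connection to *this* neuron.
        // m_myIndex selects which of the other neuron's output weights
        // points at us.
        for (unsigned n = 0; n < prevLayer.size(); ++n) {
            sum += prevLayer[n].getOutputVal()
                 * prevLayer[n].m_outputWeights[m_myIndex].weight;
        }
        // Apply a transfer function to the sum; tanh is a stand-in here.
        m_outputVal = std::tanh(sum);
    }
private:
    double m_outputVal = 0.0;
    std::vector<Connection> m_outputWeights;
    unsigned m_myIndex;
};
```

Notice that feedForward reads another Neuron's private m_outputWeights directly — that's legal in C++ because private access is per-class, not per-object, so members of the same class can see each other's privates.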
And we'll add it there to the constructor's implementation, and we need to copy the constructor's incoming argument into a member, and add that member to the private section of class Neuron.
Then we'll pass in that index value down here, where we create the neurons, in this little