Package Bio :: Package NeuralNetwork :: Package BackPropagation :: Module Layer :: Class OutputLayer

Class hierarchy:

    AbstractLayer --+
                    |
               OutputLayer
Method Summary

    __init__ -- Initialize the output layer.
    backpropagate -- Calculate the backpropagation error at a given node.
    get_error -- Return the error value at a particular node.
    set_weight -- Set a weight value from one node to the next.
    update -- Update the value of output nodes from the previous layer.

    Inherited from AbstractLayer:
        Debugging output.

Method Details
__init__(self, num_nodes, activation=logistic_function)

    Initialize the output layer.
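The default `activation` in the signature above is the logistic function. A minimal sketch, assuming the usual definition of the logistic (sigmoid) curve (the package's actual `logistic_function` is defined elsewhere):

```python
import math

# Standard logistic (sigmoid) curve; assumed to match the
# logistic_function used as the default activation above.
def logistic_function(value):
    """Map any real value into the open range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-value))

print(logistic_function(0.0))  # midpoint of the curve: 0.5
```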
backpropagate(self, outputs, learning_rate, momentum)

    Calculate the backpropagation error at a given node. The error
    term is calculated using the formula:

        p = (z - t) * z * (1 - z)

    where z is the calculated value for the node and t is the real
    (target) value.

    Arguments:
    o outputs -- The list of output values we use to calculate the
      errors in our predictions.
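The error term above can be sketched as a standalone helper (the name `output_error` is illustrative; the real method stores per-node errors on the layer rather than returning them):

```python
# Hypothetical standalone version of the output-layer error term
# described above: p = (z - t) * z * (1 - z).
def output_error(z, t):
    """Error term for an output node.

    z -- the calculated (activated) value of the node
    t -- the real (target) value
    """
    return (z - t) * z * (1 - z)

# A node that predicts 0.8 when the target is 1.0 gets a negative
# error term, which nudges its incoming weights upward.
print(output_error(0.8, 1.0))  # ~ -0.032
```

Note the `z * (1 - z)` factor is the derivative of the logistic activation, which is why this form of the error term goes with the default `logistic_function`.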
get_error(self, real_value, node_number)

    Return the error value at a particular node.
set_weight(self, this_node, next_node, value)

    Set a weight value from one node to the next. If weights are not
    explicitly set, they will be initialized to random values to start
    with.
update(self, previous_layer)

    Update the value of output nodes from the previous layer.

    Arguments:
    o previous_layer -- The hidden layer preceding this one.
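A sketch of what updating one output node from the previous layer involves, assuming the usual weighted-sum-then-activation scheme (the names here are illustrative, not the class's actual attributes):

```python
import math

def logistic_function(value):
    # Standard logistic activation, matching the default in __init__.
    return 1.0 / (1.0 + math.exp(-value))

# Illustrative update of a single output node: weight each value from
# the previous (hidden) layer, sum, and pass through the activation.
def update_node(previous_values, weights, activation=logistic_function):
    total = sum(w * v for w, v in zip(weights, previous_values))
    return activation(total)

# Two hidden-node values whose weighted contributions cancel, so the
# node settles at the logistic midpoint.
print(update_node([1.0, 0.5], [0.2, -0.4]))  # 0.5
```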
Generated by Epydoc 2.1 on Thu Aug 10 20:05:37 2006. http://epydoc.sf.net