Uses of Propagation in org.encog.neural.networks.training.propagation

Constructors in org.encog.neural.networks.training.propagation with parameters of type Propagation:
  GradientWorker(FlatNetwork theNetwork,
                 Propagation theOwner,
                 MLDataSet theTraining,
                 int theLow,
                 int theHigh,
                 double[] flatSpot,
                 ErrorFunction ef)
    Construct a gradient worker.
Uses of Propagation in org.encog.neural.networks.training.propagation.back

Subclasses of Propagation in org.encog.neural.networks.training.propagation.back:

  class Backpropagation
    This class implements a backpropagation training algorithm for feed forward neural networks.
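At the heart of backpropagation is a gradient-descent weight update, usually with a momentum term. A minimal per-weight sketch in plain Java (independent of Encog's internal classes; the class and method names here are illustrative, not Encog's API):

```java
public class BackpropStep {
    /**
     * One backpropagation weight update with momentum.
     * Returns {newWeight, newDelta} so the delta can be fed
     * back in as prevDelta on the next iteration.
     */
    static double[] update(double weight, double gradient,
                           double learnRate, double momentum,
                           double prevDelta) {
        // Move against the gradient, plus a fraction of the last step.
        double delta = -learnRate * gradient + momentum * prevDelta;
        return new double[] { weight + delta, delta };
    }

    public static void main(String[] args) {
        // Positive gradient: the weight should decrease.
        double[] r = update(0.5, 0.2, 0.1, 0.9, 0.0);
        System.out.println("new weight = " + r[0] + ", delta = " + r[1]);
    }
}
```

The returned delta is carried between iterations so that momentum can smooth successive steps.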
Uses of Propagation in org.encog.neural.networks.training.propagation.manhattan

Subclasses of Propagation in org.encog.neural.networks.training.propagation.manhattan:

  class ManhattanPropagation
    One problem with the backpropagation technique is that the magnitude of the partial derivative may be too large or too small. The Manhattan update rule addresses this by using only the sign of the gradient, applying a fixed-size step to each weight.
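The Manhattan update rule sidesteps the magnitude problem by discarding it: only the sign of the gradient is used, and every weight moves by the same fixed step. A minimal sketch (names are illustrative, not Encog's internals):

```java
public class ManhattanStep {
    /** Update a weight using only the sign of its gradient. */
    static double update(double weight, double gradient, double stepSize) {
        if (gradient > 0) return weight - stepSize; // step downhill
        if (gradient < 0) return weight + stepSize;
        return weight;                              // zero gradient: stay put
    }

    public static void main(String[] args) {
        System.out.println(update(1.0, 0.5, 0.01));  // small gradient, fixed step down
        System.out.println(update(1.0, -3.0, 0.01)); // large gradient, same-size step up
    }
}
```

Note that a gradient of 0.5 and one of -3.0 produce steps of identical size; only the direction differs.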
Uses of Propagation in org.encog.neural.networks.training.propagation.quick

Subclasses of Propagation in org.encog.neural.networks.training.propagation.quick:

  class QuickPropagation
    QPROP is an efficient training method that is based on Newton's Method.
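Quickprop treats the error curve of each weight as a parabola and jumps toward its estimated minimum, using the current gradient, the previous gradient, and the previous step. A simplified per-weight sketch of Fahlman's update (omitting the maximum-growth-factor clamp that real implementations apply; names are illustrative):

```java
public class QuickpropStep {
    /**
     * Estimate the next weight change from the current gradient,
     * the previous gradient, and the previous weight change.
     */
    static double delta(double grad, double prevGrad,
                        double prevDelta, double learnRate) {
        double denom = prevGrad - grad;
        // First step, or a degenerate (flat) parabola:
        // fall back to plain gradient descent.
        if (prevDelta == 0.0 || Math.abs(denom) < 1e-12) {
            return -learnRate * grad;
        }
        // Jump toward the vertex of the fitted parabola.
        return prevDelta * grad / denom;
    }

    public static void main(String[] args) {
        System.out.println(delta(1.0, 2.0, -0.5, 0.1)); // parabola-based step
        System.out.println(delta(1.0, 0.0, 0.0, 0.1));  // fallback step
    }
}
```

The fallback branch matters in practice: without it the update divides by zero on the first iteration or whenever the gradient stops changing.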
Uses of Propagation in org.encog.neural.networks.training.propagation.resilient

Subclasses of Propagation in org.encog.neural.networks.training.propagation.resilient:

  class ResilientPropagation
    One problem with the backpropagation algorithm is that the magnitude of the partial derivative is usually too large or too small. Resilient propagation uses only the sign of the gradient, maintaining an individual adaptive step size for each weight.
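RPROP also ignores the gradient's magnitude, but unlike the Manhattan rule it keeps a separate, adaptive step size per weight: the step grows while the gradient keeps its sign and shrinks when the sign flips. A simplified sketch of the RPROP- variant (the constants are the commonly published defaults; names are illustrative, not Encog's internals):

```java
public class RpropStep {
    static final double ETA_PLUS = 1.2;   // growth factor
    static final double ETA_MINUS = 0.5;  // shrink factor
    static final double STEP_MAX = 50.0;
    static final double STEP_MIN = 1e-6;

    /** Returns {newWeight, newStepSize}. */
    static double[] update(double weight, double grad,
                           double prevGrad, double step) {
        double signChange = grad * prevGrad;
        if (signChange > 0) {
            step = Math.min(step * ETA_PLUS, STEP_MAX);  // same direction: accelerate
        } else if (signChange < 0) {
            step = Math.max(step * ETA_MINUS, STEP_MIN); // sign flipped: back off
        }
        return new double[] { weight - Math.signum(grad) * step, step };
    }

    public static void main(String[] args) {
        // Gradient kept its sign, so the step grows from 0.1 to 0.12.
        double[] r = update(1.0, 0.5, 0.5, 0.1);
        System.out.println("weight = " + r[0] + ", step = " + r[1]);
    }
}
```

Full RPROP variants (e.g. RPROP+ with weight backtracking) add more bookkeeping, but this captures the sign-driven step adaptation that gives the method its robustness.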
Uses of Propagation in org.encog.neural.networks.training.propagation.scg

Subclasses of Propagation in org.encog.neural.networks.training.propagation.scg:

  class ScaledConjugateGradient
    This is a training class that makes use of scaled conjugate gradient methods.
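Møller's scaled conjugate gradient adds a Levenberg-Marquardt-style scaling term that is too involved for a short sketch, but the conjugate-direction update it shares with other CG methods is compact. A sketch using the Polak-Ribiere formula as a stand-in (SCG proper computes its beta differently; names are illustrative):

```java
public class ConjugateDirection {
    /**
     * Compute the next search direction d = -gNew + beta * dOld,
     * with beta from the Polak-Ribiere formula (reset to 0 if negative).
     */
    static double[] nextDirection(double[] gNew, double[] gOld, double[] dOld) {
        double num = 0.0, den = 0.0;
        for (int i = 0; i < gNew.length; i++) {
            num += gNew[i] * (gNew[i] - gOld[i]);
            den += gOld[i] * gOld[i];
        }
        double beta = Math.max(0.0, num / den);
        double[] d = new double[gNew.length];
        for (int i = 0; i < d.length; i++) {
            d[i] = -gNew[i] + beta * dOld[i];
        }
        return d;
    }

    public static void main(String[] args) {
        // Unchanged gradient gives beta = 0: pure steepest descent.
        double[] d = nextDirection(new double[] {1, 1},
                                   new double[] {1, 1},
                                   new double[] {5, 5});
        System.out.println(d[0] + ", " + d[1]);
    }
}
```

Mixing a fraction of the previous direction into each step is what lets conjugate-gradient methods avoid the zig-zagging of plain steepest descent.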