Modifier and Type | Class and Description
---|---
`class` | `HessianCR`: Calculate the Hessian matrix using the chain rule method.
Modifier and Type | Class and Description
---|---
`class` | `ParallelScore`: This class is used to calculate the scores for an entire population.
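As a rough illustration of what a parallel population scorer does (this is a sketch, not the Encog `ParallelScore` API), the code below fans genome evaluations out across a fixed thread pool; the `score` fitness function here is a made-up placeholder:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelScoreSketch {
    // Hypothetical fitness function: sum of squares of the genome values.
    static double score(double[] genome) {
        double s = 0;
        for (double g : genome) s += g * g;
        return s;
    }

    // Score every genome of the population concurrently on a fixed pool.
    static double[] scoreAll(List<double[]> population, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Double>> futures = new ArrayList<>();
            for (double[] genome : population) {
                futures.add(pool.submit(() -> score(genome)));
            }
            double[] scores = new double[population.size()];
            for (int i = 0; i < scores.length; i++) {
                scores[i] = futures.get(i).get();   // blocks until that score is done
            }
            return scores;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```

Because each genome is scored independently, the work parallelizes with no shared mutable state beyond the result array.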
Modifier and Type | Class and Description
---|---
`class` | `BasicEA`: Provides a basic implementation of a multi-threaded Evolutionary Algorithm.
`class` | `TrainEA`: Provides a MLTrain-compatible class that can be used to train genomes.
Modifier and Type | Class and Description
---|---
`class` | `MLMethodGeneticAlgorithm`: Implements a genetic algorithm that allows an MLMethod that is encodable (MLEncodable) to be trained.
`class` | `MLMethodGeneticAlgorithm.MLMethodGeneticAlgorithmHelper`: Very simple class that implements a genetic algorithm.
Modifier and Type | Class and Description
---|---
`class` | `AbstractPrgGenerator`: The abstract base for Full and Grow program generation.
`class` | `PrgFullGenerator`: The full generator works by creating program trees that do not stop prematurely.
`class` | `PrgGrowGenerator`: The grow generator creates a random program by choosing a random node from both the function and terminal sets until the maximum depth is reached.
`class` | `RampedHalfAndHalf`: Because neither the grow nor the full method provides a very wide array of sizes or shapes on its own, Koza (1992) proposed a combination called ramped half-and-half.
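The difference between the three generators above can be sketched with a minimal binary expression tree (this is an illustration of the technique, not the Encog generator classes): full always recurses to the maximum depth, grow may pick a terminal early, and ramped half-and-half ramps the depth limit while alternating between the two.

```java
import java.util.Random;

public class TreeGenSketch {
    // Minimal expression-tree node: a node with no children is a terminal.
    static class Node {
        Node left, right;
        Node(Node l, Node r) { left = l; right = r; }
        int depth() {
            if (left == null) return 1;
            return 1 + Math.max(left.depth(), right.depth());
        }
    }

    // Full method: always choose a function node until maxDepth,
    // so every branch reaches exactly maxDepth.
    static Node full(int maxDepth, Random rnd) {
        if (maxDepth <= 1) return new Node(null, null);        // terminal
        return new Node(full(maxDepth - 1, rnd), full(maxDepth - 1, rnd));
    }

    // Grow method: choose from functions AND terminals at every level,
    // so branches may stop before maxDepth (30% terminal chance here is arbitrary).
    static Node grow(int maxDepth, Random rnd) {
        if (maxDepth <= 1 || rnd.nextDouble() < 0.3) return new Node(null, null);
        return new Node(grow(maxDepth - 1, rnd), grow(maxDepth - 1, rnd));
    }

    // Ramped half-and-half: ramp the depth limit over 2..maxDepth and
    // alternate between full and grow to get a wide variety of shapes.
    static Node ramped(int i, int maxDepth, Random rnd) {
        int depth = 2 + i % (maxDepth - 1);
        return (i % 2 == 0) ? full(depth, rnd) : grow(depth, rnd);
    }
}
```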
Modifier and Type | Class and Description
---|---
`class` | `LevenbergMarquardtTraining`: Trains a neural network using the Levenberg-Marquardt algorithm (LMA).
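LMA blends Gauss-Newton with gradient descent via a damping term lambda. For a single parameter the step reduces to a scalar formula, sketched below (an illustration of the update rule, not the Encog training class; the multi-parameter case solves the linear system (J^T J + lambda I) delta = J^T e instead):

```java
public class LmStepSketch {
    // Single-parameter Levenberg-Marquardt step:
    //   delta = (J^T e) / (J^T J + lambda)
    // lambda -> 0 recovers Gauss-Newton; large lambda shrinks the step
    // toward a small gradient-descent move.
    static double step(double[] jacobian, double[] error, double lambda) {
        double jtj = 0, jte = 0;
        for (int i = 0; i < jacobian.length; i++) {
            jtj += jacobian[i] * jacobian[i];   // J^T J (scalar case)
            jte += jacobian[i] * error[i];      // J^T e
        }
        return jte / (jtj + lambda);
    }
}
```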
Modifier and Type | Class and Description
---|---
`class` | `Propagation`: Implements basic functionality that is needed by each of the propagation methods.
Modifier and Type | Class and Description
---|---
`class` | `Backpropagation`: This class implements a backpropagation training algorithm for feedforward neural networks.
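The core of a backpropagation trainer, once the gradients are known, is a gradient-descent weight update with an optional momentum term. A minimal sketch of that update (illustrative only, not the Encog class):

```java
public class BackpropUpdateSketch {
    // One gradient-descent step with momentum, as used by classic
    // backpropagation: delta = -lr * gradient + momentum * previousDelta.
    // prevDelta is updated in place so it carries over to the next iteration.
    static double[] step(double[] weights, double[] grad,
                         double[] prevDelta, double lr, double momentum) {
        double[] updated = weights.clone();
        for (int i = 0; i < weights.length; i++) {
            double delta = -lr * grad[i] + momentum * prevDelta[i];
            updated[i] = weights[i] + delta;
            prevDelta[i] = delta;   // remembered for the momentum term
        }
        return updated;
    }
}
```

With a learning rate of 0.1 and zero previous delta, a gradient of 0.5 moves a weight of 1.0 down by 0.05.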
Modifier and Type | Class and Description
---|---
`class` | `ManhattanPropagation`: One problem with the backpropagation technique is that the magnitude of the partial derivative may come out too large or too small; Manhattan propagation sidesteps this by using only the sign of the derivative.
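The Manhattan update rule discards the gradient's magnitude entirely and moves each weight by a fixed step against the gradient's sign. A minimal sketch of the rule (not the Encog class itself):

```java
public class ManhattanUpdateSketch {
    // Manhattan update rule: ignore the gradient's magnitude and move each
    // weight by a fixed step opposite the sign of its gradient.
    static double[] step(double[] weights, double[] grad, double stepSize) {
        double[] updated = weights.clone();
        for (int i = 0; i < weights.length; i++) {
            if (grad[i] > 0) updated[i] -= stepSize;
            else if (grad[i] < 0) updated[i] += stepSize;
            // a gradient of exactly zero leaves the weight unchanged
        }
        return updated;
    }
}
```

A gradient of 2.5 and one of 0.001 produce the same-sized move, which is the point of the method.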
Modifier and Type | Class and Description
---|---
`class` | `QuickPropagation`: QPROP is an efficient training method that is based on Newton's Method.
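Quickprop's central idea is to treat the error surface of each weight as a parabola and jump toward its estimated minimum using the current and previous gradients. A sketch of that per-weight step (illustrative, not the Encog class; a real implementation also clamps the step and falls back to gradient descent when the estimate is unusable):

```java
public class QpropSketch {
    // Quickprop step: fit a parabola through the last two gradient samples
    // and jump toward its minimum:
    //   delta = g / (gPrev - g) * deltaPrev
    static double delta(double grad, double prevGrad, double prevDelta) {
        double denom = prevGrad - grad;
        if (denom == 0) return 0;   // degenerate parabola: make no move
        return grad / denom * prevDelta;
    }
}
```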
Modifier and Type | Class and Description
---|---
`class` | `ResilientPropagation`: One problem with the backpropagation algorithm is that the magnitude of the partial derivative is usually too large or too small.
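RPROP addresses that problem by maintaining an independent, adaptive step size per weight: the step grows while the gradient keeps its sign and shrinks when the sign flips, and the gradient's magnitude is never used directly. A sketch of the per-weight rule with the commonly used constants (illustrative, not the Encog class):

```java
public class RpropSketch {
    // Commonly used RPROP constants (an assumption here, not read from Encog):
    // grow factor 1.2, shrink factor 0.5, with step-size bounds.
    static final double ETA_PLUS = 1.2, ETA_MINUS = 0.5;
    static final double MAX_STEP = 50.0, MIN_STEP = 1e-6;

    // New step size for one weight, given the current and previous gradient.
    static double updateStep(double step, double grad, double prevGrad) {
        double sign = grad * prevGrad;
        if (sign > 0) return Math.min(step * ETA_PLUS, MAX_STEP);   // same direction: accelerate
        if (sign < 0) return Math.max(step * ETA_MINUS, MIN_STEP);  // sign flip: we overshot, back off
        return step;                                                // zero gradient: keep the step
    }

    // The weight moves by the step size, in the direction opposite the gradient.
    static double updateWeight(double w, double grad, double step) {
        return w - Math.signum(grad) * step;
    }
}
```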
Modifier and Type | Class and Description
---|---
`class` | `ScaledConjugateGradient`: This is a training class that makes use of scaled conjugate gradient methods.
Modifier and Type | Class and Description
---|---
`class` | `PruneIncremental`: This class is used to help determine the optimal configuration for the hidden layers of a neural network.
Modifier and Type | Class and Description
---|---
`class` | `EngineConcurrency`: This class abstracts thread pools, and potentially grids and other types of concurrency.
Modifier and Type | Class and Description
---|---
`class` | `ConcurrentJob`: This class forms the basis for a job that can be run concurrently.
Copyright © 2014. All Rights Reserved.