org.encog.engine.network.activation
Class ActivationClippedLinear

java.lang.Object
  extended by org.encog.engine.network.activation.ActivationClippedLinear
All Implemented Interfaces:
Serializable, Cloneable, ActivationFunction

public class ActivationClippedLinear
extends Object
implements ActivationFunction

Linear activation function that clips its output to the range [-1, +1]. This activation function is typically used as part of a CPPN neural network, such as those evolved by HyperNEAT. The idea for this activation function was developed by Kenneth Stanley of the University of Central Florida. http://www.cs.ucf.edu/~kstanley/

See Also:
Serialized Form
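The clipped linear function itself is simple: values inside [-1, +1] pass through unchanged, and values outside are clamped to the nearest bound. A minimal standalone sketch of the same idea (not the Encog source):

```java
public class ClippedLinearSketch {

    // f(x) = max(-1, min(1, x)): identity inside [-1, +1], clamped outside.
    public static double clip(double x) {
        return Math.max(-1.0, Math.min(1.0, x));
    }

    public static void main(String[] args) {
        System.out.println(clip(0.5));   // inside the range: passes through
        System.out.println(clip(3.0));   // above the range: clamped to 1.0
        System.out.println(clip(-7.2));  // below the range: clamped to -1.0
    }
}
```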

Constructor Summary
ActivationClippedLinear()
           
 
Method Summary
 void activationFunction(double[] d, int start, int size)
          Implements the activation function.
 ActivationFunction clone()
          
 double derivativeFunction(double b, double a)
          Calculate the derivative.
 String getFactoryCode()
          
 String[] getParamNames()
          
 double[] getParams()
          
 boolean hasDerivative()
          
 void setParam(int index, double value)
          Set one of the params for this activation function.
 
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

ActivationClippedLinear

public ActivationClippedLinear()
Method Detail

activationFunction

public void activationFunction(double[] d,
                               int start,
                               int size)
Implements the activation function. The array is modified according to the activation function being used. See the class description for more specific information on this type of activation function.

Specified by:
activationFunction in interface ActivationFunction
Parameters:
d - The input array to the activation function.
start - The starting index.
size - The number of values to calculate.
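The method modifies the array in place over a window of values. A hedged sketch of the assumed contract, clamping each element in [start, start + size) to [-1, +1] (the helper name and standalone class are illustrative, not Encog's code):

```java
public class ClipInPlaceSketch {

    // Assumed semantics of activationFunction(d, start, size):
    // clamp each element of d in the window [start, start + size) to [-1, +1],
    // leaving elements outside the window untouched.
    public static void activation(double[] d, int start, int size) {
        for (int i = start; i < start + size; i++) {
            d[i] = Math.max(-1.0, Math.min(1.0, d[i]));
        }
    }

    public static void main(String[] args) {
        double[] d = {-2.0, 0.25, 5.0};
        activation(d, 0, d.length);
        System.out.println(java.util.Arrays.toString(d)); // [-1.0, 0.25, 1.0]
    }
}
```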

derivativeFunction

public double derivativeFunction(double b,
                                 double a)
Calculate the derivative. For performance reasons, two numbers are provided. The value "b" is the number whose derivative we want to calculate, and the value "a" is the value returned by the activation function when presented with "b". Two values are provided because some of the most common activation functions express their derivative in terms of the activation output, and it is bad for performance to calculate that value twice. Not all derivatives are calculated this way, however. By providing both the value before the activation function is applied ("b") and the value after it is applied ("a"), an implementation can use whichever value is most efficient.

Specified by:
derivativeFunction in interface ActivationFunction
Parameters:
b - The number to calculate the derivative of, the number "before" the activation function was applied.
a - The number "after" an activation function has been applied.
Returns:
The derivative.
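For the clipped linear function, the mathematical derivative is 1 inside the linear region and 0 where the output is clamped. The sketch below illustrates that piecewise derivative using the interface's "b" (pre-activation) convention; it is an illustration of the math, not necessarily the exact value Encog's implementation returns in the clamped region:

```java
public class ClippedLinearDerivativeSketch {

    // Piecewise derivative of f(x) = max(-1, min(1, x)):
    // slope 1 where the input passes through, slope 0 where it is clamped.
    // "b" is the value before the activation was applied.
    public static double derivative(double b) {
        return (b > -1.0 && b < 1.0) ? 1.0 : 0.0;
    }

    public static void main(String[] args) {
        System.out.println(derivative(0.3));  // linear region: slope 1
        System.out.println(derivative(2.0));  // clamped region: slope 0
    }
}
```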

hasDerivative

public boolean hasDerivative()

Specified by:
hasDerivative in interface ActivationFunction
Returns:
Return true if this function has a derivative.

getParams

public double[] getParams()

Specified by:
getParams in interface ActivationFunction
Returns:
The params for this activation function.

setParam

public void setParam(int index,
                     double value)
Set one of the params for this activation function.

Specified by:
setParam in interface ActivationFunction
Parameters:
index - The index of the param to set.
value - The value to set.

getParamNames

public String[] getParamNames()

Specified by:
getParamNames in interface ActivationFunction
Returns:
The names of the parameters.

clone

public final ActivationFunction clone()

Specified by:
clone in interface ActivationFunction
Overrides:
clone in class Object
Returns:
A cloned copy of this activation function.

getFactoryCode

public String getFactoryCode()

Specified by:
getFactoryCode in interface ActivationFunction
Returns:
The Encog factory code string. Return null if you do not wish to support creating this activation function through a factory.


Copyright © 2014. All Rights Reserved.