Package org.encog.engine.network.activation

Interface Summary
ActivationFunction This interface allows various activation functions to be used with the neural network.
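As a rough illustration of this contract, the sketch below mirrors the general shape of such an interface: activation is applied in place to a slice of a double array, and a derivative is available for backpropagation. The names here are illustrative, not Encog's exact API.

```java
// Illustrative sketch (not Encog's exact signatures) of an
// activation-function contract plus one sample implementation.
interface Activation {
    // Apply the activation in place to d[start .. start+size-1].
    void activate(double[] d, int start, int size);
    // Derivative used during backpropagation; "after" is the
    // already-activated value, which many functions can reuse.
    double derivative(double before, double after);
}

public class SigmoidDemo implements Activation {
    public void activate(double[] d, int start, int size) {
        for (int i = start; i < start + size; i++) {
            d[i] = 1.0 / (1.0 + Math.exp(-d[i]));  // logistic sigmoid
        }
    }

    public double derivative(double before, double after) {
        // sigmoid'(x) = s(x) * (1 - s(x)), expressed via the output
        return after * (1.0 - after);
    }

    public static void main(String[] args) {
        double[] v = {0.0, 2.0};
        new SigmoidDemo().activate(v, 0, v.length);
        System.out.println(v[0] + " " + v[1]);  // v[0] == 0.5
    }
}
```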
 

Class Summary
ActivationBiPolar BiPolar activation function: inputs greater than zero become +1, all others become -1.
ActivationBipolarSteepenedSigmoid A bipolar version of the steepened sigmoid activation function; its output ranges over [-1,+1] rather than [0,1].
ActivationClippedLinear Linear activation function that bounds the output to [-1,+1].
ActivationCompetitive An activation function that only allows a specified number, usually one, of the outbound connections to win.
ActivationElliott Computationally efficient alternative to ActivationSigmoid.
ActivationElliottSymmetric Computationally efficient alternative to ActivationTANH.
ActivationGaussian An activation function based on the Gaussian function.
ActivationLinear The linear activation function passes its input through unchanged; it is effectively no activation at all.
ActivationLOG An activation function based on the logarithm function.
ActivationRamp A ramp activation function.
ActivationSigmoid The sigmoid activation function squashes input into the range (0,1) along an S-shaped curve.
ActivationSIN An activation function based on the sin function, with a double period.
ActivationSoftMax The softmax activation function.
ActivationSteepenedSigmoid The Steepened Sigmoid is an activation function typically used with NEAT.
ActivationStep The step activation function is a very simple activation function: it outputs one of two fixed values depending on whether the input reaches a threshold.
ActivationTANH The hyperbolic tangent activation function follows the S-shaped tanh curve, with output in the range (-1,+1).
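To make the relationships among several of the classes above concrete, the sketch below implements their underlying formulas directly (method names are illustrative, not Encog's API; the Elliott formulas assume a slope parameter of 1). It shows, for example, why ActivationElliott and ActivationElliottSymmetric are cheap alternatives to the sigmoid and tanh: they replace the exponential with an absolute value and a division.

```java
// Formulas behind several activation classes listed above.
// Names and slope assumptions are illustrative, not Encog's API.
public class ActivationMath {
    // ActivationSigmoid: range (0,1)
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // ActivationElliott: sigmoid-like curve without exp(), range (0,1)
    static double elliott(double x) { return 0.5 * x / (1.0 + Math.abs(x)) + 0.5; }

    // ActivationElliottSymmetric: tanh-like curve without exp(), range (-1,1)
    static double elliottSym(double x) { return x / (1.0 + Math.abs(x)); }

    // ActivationClippedLinear: identity bounded to [-1,+1]
    static double clippedLinear(double x) { return Math.max(-1.0, Math.min(1.0, x)); }

    public static void main(String[] args) {
        for (double x : new double[]{-2.0, 0.0, 2.0}) {
            System.out.printf("x=%5.1f  sigmoid=%.4f  elliott=%.4f  tanh=%.4f  elliottSym=%.4f%n",
                    x, sigmoid(x), elliott(x), Math.tanh(x), elliottSym(x));
        }
    }
}
```

Running the loop makes the approximation visible: both pairs agree at x = 0 and share the same output ranges, while the Elliott variants approach their asymptotes more slowly.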
 
Copyright © 2014. All Rights Reserved.