Uses of Class
org.joone.engine.Layer

Packages that use Layer
org.joone.engine   
org.joone.net   
org.joone.structure   
 

Uses of Layer in org.joone.engine
 

Subclasses of Layer in org.joone.engine
 class BiasedLinearLayer
          This layer consists of linear neurons, i.e. neurons that sum their inputs together with their biases (in Joone the actual summation is performed by the (full) synapse).
 class ContextLayer
          The context layer is similar to the linear layer except that it has an auto-recurrent connection between its output and input.
 class DelayLayer
          Delay unit used to create temporal windows from a time series: each output emits the input delayed by a different number of steps, Yk(t-1) ... Yk(t-N).
 class GaussianLayer
          This layer implements the Gaussian Neighborhood SOM strategy.
 class GaussLayer
          The output of a Gauss(ian) layer neuron is the sum of the weighted input values, applied to a Gaussian curve, exp(-x * x).
 class LinearLayer
          The output of a linear layer neuron is the sum of the weighted input values, scaled by the beta parameter.
 class LogarithmicLayer
          This layer implements a logarithmic transfer function.
 class MemoryLayer
           
 class RbfGaussianLayer
          This class implements the nonlinear layer in Radial Basis Function (RBF) networks using Gaussian functions.
 class RbfLayer
          This is the base (helper) class for radial basis function layers.
 class SigmoidLayer
          The output of a sigmoid layer neuron is the sum of the weighted input values, applied to a sigmoid function.
 class SimpleLayer
          This abstract class represents layers composed of neurons that implement some transfer function.
 class SineLayer
          The output of a sine layer neuron is the sum of the weighted input values, applied to the sine function, sin(x).
 class SoftmaxLayer
          The outputs of the Softmax layer must be interpreted as probabilities.
 class TanhLayer
          Layer that applies the hyperbolic tangent transfer function to its input patterns.
 class WTALayer
          This layer implements the Winner Takes All SOM strategy.
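All of these subclasses are created and sized through the API they inherit from Layer. A minimal sketch follows (the class choices, layer names and row counts are illustrative, not prescribed by this page):

    import org.joone.engine.Layer;
    import org.joone.engine.LinearLayer;
    import org.joone.engine.SigmoidLayer;

    public class LayerSketch {
        public static void main(String[] args) {
            // Every concrete subclass is configured through the common
            // Layer methods: setLayerName labels the layer, setRows sets
            // the number of neurons it contains.
            Layer input = new LinearLayer();
            input.setLayerName("input");
            input.setRows(2);

            Layer hidden = new SigmoidLayer();
            hidden.setLayerName("hidden");
            hidden.setRows(4);
        }
    }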
 

Uses of Layer in org.joone.net
 

Subclasses of Layer in org.joone.net
 class NestedNeuralLayer
           
 

Methods in org.joone.net that return Layer
 Layer[] NeuralNet.calculateOrderedLayers()
          This method calculates the order of the layers of the network, from the input to the output.
 Layer NeuralNet.findInputLayer()
          Returns the input layer, searching for it according to the rules documented in Layer.isInputLayer.
 Layer NeuralNet.findOutputLayer()
          Returns the output layer, searching for it according to the rules documented in Layer.isOutputLayer.
 Layer NeuralNet.getInputLayer()
          Returns the input layer of the network.
 Layer NeuralNet.getLayer(java.lang.String layerName)
           
 Layer[] NeuralNet.getOrderedLayers()
           
 Layer NeuralNet.getOutputLayer()
          Returns the output layer of the network.
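A hedged usage sketch of the accessors above (the layer name "hidden" is illustrative and assumes the layer was named via setLayerName):

    import org.joone.engine.Layer;
    import org.joone.net.NeuralNet;

    public class LayerLookupSketch {
        // Inspect the layers of an already assembled network.
        public static void printLayers(NeuralNet net) {
            Layer input  = net.getInputLayer();     // declared input layer
            Layer hidden = net.getLayer("hidden");  // lookup by layer name
            Layer[] ordered = net.calculateOrderedLayers();
            for (Layer l : ordered) {               // input-to-output order
                System.out.println(l.getLayerName());
            }
        }
    }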
 

Methods in org.joone.net with parameters of type Layer
 void NeuralNet.addLayer(Layer layer)
           
 void NeuralNet.addLayer(Layer layer, int tier)
           
 void NeuralNet.removeLayer(Layer layer)
           
 void NeuralNet.setInputLayer(Layer newLayer)
           
 void NeuralNet.setOrderedLayers(Layer[] orderedLayers)
          This method allows a particular traversal order for the Layers to be set externally.
 void NeuralNet.setOutputLayer(Layer newLayer)
           
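A minimal assembly sketch tying these mutators together; it assumes the tier constants INPUT_LAYER, HIDDEN_LAYER and OUTPUT_LAYER defined on NeuralNet, and uses FullSynapse to connect the layers:

    import org.joone.engine.FullSynapse;
    import org.joone.engine.LinearLayer;
    import org.joone.engine.SigmoidLayer;
    import org.joone.net.NeuralNet;

    public class NetAssemblySketch {
        public static void main(String[] args) {
            LinearLayer input = new LinearLayer();
            input.setRows(2);
            SigmoidLayer hidden = new SigmoidLayer();
            hidden.setRows(3);
            SigmoidLayer output = new SigmoidLayer();
            output.setRows(1);

            // Wire the layers together with full synapses.
            FullSynapse inToHid = new FullSynapse();
            input.addOutputSynapse(inToHid);
            hidden.addInputSynapse(inToHid);
            FullSynapse hidToOut = new FullSynapse();
            hidden.addOutputSynapse(hidToOut);
            output.addInputSynapse(hidToOut);

            // Register each layer; the two-argument addLayer also
            // records which tier the layer belongs to.
            NeuralNet net = new NeuralNet();
            net.addLayer(input, NeuralNet.INPUT_LAYER);
            net.addLayer(hidden, NeuralNet.HIDDEN_LAYER);
            net.addLayer(output, NeuralNet.OUTPUT_LAYER);
        }
    }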
 

Uses of Layer in org.joone.structure
 

Methods in org.joone.structure that return Layer
protected  Layer Nakayama.findInputLayer(Synapse aSynapse)
          Finds the input layer of a synapse.
protected  Layer Nakayama.findOutputLayer(Synapse aSynapse)
          Finds the output layer of a synapse.
 

Methods in org.joone.structure with parameters of type Layer
 void Nakayama.addLayer(Layer aLayer)
          Adds a layer to this optimizer.
protected  double Nakayama.getSumAbsoluteWeights(Layer aLayer, int aNeuron)
          Sums up all the absolute values of the output weights of a neuron within a layer.
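A rough sketch of how the optimizer might be fed layers. Only addLayer(Layer) comes from the listing above; the Nakayama constructor shown here is an assumption, so check the class documentation for the actual signature:

    import org.joone.engine.SigmoidLayer;
    import org.joone.net.NeuralNet;
    import org.joone.structure.Nakayama;

    public class NakayamaSketch {
        public static void main(String[] args) {
            NeuralNet net = new NeuralNet();  // assume layers added elsewhere
            SigmoidLayer hidden = new SigmoidLayer();
            hidden.setRows(5);

            // ASSUMPTION: a constructor taking the network to optimize.
            Nakayama optimizer = new Nakayama(net);
            // Register the layer whose neurons the optimizer may examine.
            optimizer.addLayer(hidden);
        }
    }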
 


