OpenANN
1.1.0
An open source library for artificial neural networks.
OpenANN::Optimizable Class Reference

Represents an optimizable object.
#include <Optimizable.h>
Public Member Functions

virtual ~Optimizable()

virtual void finishedIteration()
    This callback is called after each optimization algorithm iteration.

Batch Methods
Functions that must be implemented in every Optimizable.

virtual bool providesInitialization() = 0
    Check if the object knows how to initialize its parameters.

virtual void initialize() = 0
    Initialize the optimizable parameters.

virtual unsigned dimension() = 0
    Request the number of optimizable parameters.

virtual const Eigen::VectorXd &currentParameters() = 0
    Request the current parameters.

virtual void setParameters(const Eigen::VectorXd &parameters) = 0
    Set new parameters.

virtual double error() = 0
    Compute error on training set.

virtual bool providesGradient() = 0
    Check if the object provides a gradient of the error function with respect to its parameters.

virtual Eigen::VectorXd gradient() = 0
    Compute gradient of the error function with respect to the parameters.

Mini-batch Methods
Functions that should be implemented to speed up optimization and are required by some optimization algorithms.

virtual unsigned examples()
    Request number of training examples.

virtual double error(unsigned n)
    Compute error of a given training example.

virtual Eigen::VectorXd gradient(unsigned n)
    Compute gradient of a given training example.

virtual void errorGradient(int n, double &value, Eigen::VectorXd &grad)
    Calculates the function value and gradient of a training example.

virtual void errorGradient(double &value, Eigen::VectorXd &grad)
    Calculates the function value and gradient of all training examples.

virtual Eigen::VectorXd error(std::vector<int>::const_iterator startN, std::vector<int>::const_iterator endN)
    Calculates the errors of given training examples.

virtual Eigen::VectorXd gradient(std::vector<int>::const_iterator startN, std::vector<int>::const_iterator endN)
    Calculates the accumulated gradient of given training examples.

virtual void errorGradient(std::vector<int>::const_iterator startN, std::vector<int>::const_iterator endN, double &value, Eigen::VectorXd &grad)
    Calculates the accumulated gradient and error of given training examples.
Detailed Description

Represents an optimizable object.

An optimizable object can be anything that provides an objective function. In this context the objective function is called the error function, and it is minimized. For a neural network this could be, for example, the sum of squared errors between predictions and targets. The idea is more general, though: we could just as well optimize the reward in a reinforcement learning problem or the energy of an unsupervised model such as an RBM.
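To make the interface concrete, here is a minimal sketch of a custom Optimizable that minimizes the squared distance between its parameter vector and a fixed target. The QuadraticProblem class and everything inside it are illustrative assumptions rather than part of OpenANN; only the mandatory batch methods are implemented, so the mini-batch methods keep their default behavior.

#include <Optimizable.h>
#include <Eigen/Core>

// Illustrative example (not part of OpenANN): minimize the squared distance
// between the parameter vector and a fixed target vector. Only the mandatory
// batch methods are implemented.
class QuadraticProblem : public OpenANN::Optimizable
{
public:
  QuadraticProblem(const Eigen::VectorXd& target)
    : target(target), params(Eigen::VectorXd::Zero(target.size()))
  {
  }

  virtual bool providesInitialization() { return true; }
  virtual void initialize() { params.setRandom(); }
  virtual unsigned dimension() { return (unsigned) params.size(); }
  virtual const Eigen::VectorXd& currentParameters() { return params; }
  virtual void setParameters(const Eigen::VectorXd& parameters) { params = parameters; }
  // Error function E(p) = 0.5 * ||p - target||^2
  virtual double error() { return 0.5 * (params - target).squaredNorm(); }
  virtual bool providesGradient() { return true; }
  // Gradient dE/dp = p - target
  virtual Eigen::VectorXd gradient() { return params - target; }

private:
  Eigen::VectorXd target;
  Eigen::VectorXd params;
};

An object like this exposes everything a gradient-based optimizer needs: the current parameters, a way to replace them, and the error function together with its gradient.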
Constructor & Destructor Documentation

virtual ~Optimizable()  [inline, virtual]

Member Function Documentation

virtual const Eigen::VectorXd &currentParameters() = 0  [pure virtual]
Request the current parameters.
Implemented in OpenANN::Net, OpenANN::RBM, OpenANN::IntrinsicPlasticity, NeuroEvolutionAgent, and OpenANN::SparseAutoEncoder.
virtual unsigned dimension() = 0  [pure virtual]
Request the number of optimizable parameters.
Implemented in OpenANN::Net, OpenANN::RBM, NeuroEvolutionAgent, OpenANN::IntrinsicPlasticity, and OpenANN::SparseAutoEncoder.
virtual double error() = 0  [pure virtual]
Compute error on training set.
Implemented in OpenANN::Net, OpenANN::RBM, NeuroEvolutionAgent, OpenANN::IntrinsicPlasticity, and OpenANN::SparseAutoEncoder.
virtual double error(unsigned n)  [inline, virtual]

Compute error of a given training example.

Parameters:
    n    index of the training example in the dataset
virtual Eigen::VectorXd error(std::vector<int>::const_iterator startN, std::vector<int>::const_iterator endN)  [virtual]

Calculates the errors of given training examples.

Parameters:
    startN    iterator pointing to the first index in the index vector
    endN      iterator pointing one past the last index in the index vector
virtual void errorGradient(int n, double &value, Eigen::VectorXd &grad)  [virtual]

Calculates the function value and gradient of a training example.

Parameters:
    n        index of the training example
    value    function value
    grad     gradient of the function, length must be dimension()
Reimplemented in OpenANN::Net.
virtual void errorGradient(double &value, Eigen::VectorXd &grad)  [virtual]

Calculates the function value and gradient of all training examples.

Parameters:
    value    function value
    grad     gradient of the function, length must be dimension()
Reimplemented in OpenANN::Net, and OpenANN::SparseAutoEncoder.
virtual void errorGradient(std::vector<int>::const_iterator startN, std::vector<int>::const_iterator endN, double &value, Eigen::VectorXd &grad)  [virtual]

Calculates the accumulated gradient and error of given training examples.

Parameters:
    startN    iterator pointing to the first index in the index vector
    endN      iterator pointing one past the last index in the index vector
    value     function value
    grad      gradient of the function, length must be dimension()
Reimplemented in OpenANN::Net, and OpenANN::RBM.
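As a sketch of how the iterator-range overloads might be consumed, the following hypothetical helper draws a random mini-batch of example indices and performs a single gradient step. The function miniBatchStep, the batch size, and the learning rate are assumptions made for illustration and are not part of the OpenANN API.

#include <Optimizable.h>
#include <Eigen/Core>
#include <algorithm>
#include <random>
#include <vector>

// Hypothetical helper (not an OpenANN API): accumulate error and gradient
// over one randomly drawn mini-batch and take a plain gradient step.
void miniBatchStep(OpenANN::Optimizable& opt, unsigned batchSize)
{
  // Collect and shuffle the indices of all training examples.
  std::vector<int> indices;
  for(unsigned n = 0; n < opt.examples(); n++)
    indices.push_back(n);
  std::mt19937 rng(std::random_device{}());
  std::shuffle(indices.begin(), indices.end(), rng);

  // Accumulated error and gradient of the first batchSize shuffled indices;
  // grad must have length dimension().
  std::vector<int>::const_iterator startN = indices.begin();
  std::vector<int>::const_iterator endN =
      indices.begin() + std::min<std::size_t>(batchSize, indices.size());
  double value = 0.0;
  Eigen::VectorXd grad = Eigen::VectorXd::Zero(opt.dimension());
  opt.errorGradient(startN, endN, value, grad);

  // Fixed learning rate of 0.01, chosen arbitrarily for this sketch.
  opt.setParameters(opt.currentParameters() - 0.01 * grad);
}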
virtual unsigned examples()  [inline, virtual]
Request number of training examples.
Reimplemented in OpenANN::Net, OpenANN::RBM, and OpenANN::IntrinsicPlasticity.
virtual void finishedIteration()  [inline, virtual]
This callback is called after each optimization algorithm iteration.
Reimplemented in OpenANN::Net.
virtual Eigen::VectorXd gradient() = 0  [pure virtual]
Compute gradient of the error function with respect to the parameters.
Implemented in OpenANN::Net, OpenANN::RBM, OpenANN::IntrinsicPlasticity, NeuroEvolutionAgent, and OpenANN::SparseAutoEncoder.
virtual Eigen::VectorXd gradient(unsigned n)  [inline, virtual]

Compute gradient of a given training example.

Parameters:
    n    index of the training example in the dataset
virtual Eigen::VectorXd gradient(std::vector<int>::const_iterator startN, std::vector<int>::const_iterator endN)  [virtual]

Calculates the accumulated gradient of given training examples.

Parameters:
    startN    iterator pointing to the first index in the index vector
    endN      iterator pointing one past the last index in the index vector
virtual void initialize() = 0  [pure virtual]
Initialize the optimizable parameters.
Implemented in OpenANN::Net, OpenANN::RBM, NeuroEvolutionAgent, OpenANN::IntrinsicPlasticity, and OpenANN::SparseAutoEncoder.
virtual bool providesGradient() = 0  [pure virtual]
Check if the object provides a gradient of the error function with respect to its parameters.
Implemented in OpenANN::Net, OpenANN::RBM, NeuroEvolutionAgent, OpenANN::IntrinsicPlasticity, and OpenANN::SparseAutoEncoder.
virtual bool providesInitialization() = 0  [pure virtual]
Check if the object knows how to initialize its parameters.
Implemented in OpenANN::Net, OpenANN::RBM, NeuroEvolutionAgent, OpenANN::IntrinsicPlasticity, and OpenANN::SparseAutoEncoder.
virtual void setParameters(const Eigen::VectorXd &parameters) = 0  [pure virtual]

Set new parameters.

Parameters:
    parameters    new parameters
Implemented in OpenANN::Net, OpenANN::RBM, NeuroEvolutionAgent, OpenANN::IntrinsicPlasticity, and OpenANN::SparseAutoEncoder.
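Putting the pieces together, a very simple optimizer can be written entirely against the abstract interface documented above. The following full-batch gradient descent loop is only a sketch; simpleGradientDescent, the step size, and the iteration count are assumptions and not an OpenANN API.

#include <Optimizable.h>
#include <Eigen/Core>

// Illustrative sketch (not an OpenANN API): plain full-batch gradient descent
// driven entirely through the Optimizable interface documented above.
void simpleGradientDescent(OpenANN::Optimizable& opt, int iterations = 100)
{
  // Let the object initialize its own parameters if it knows how to.
  if(opt.providesInitialization())
    opt.initialize();

  for(int it = 0; it < iterations; it++)
  {
    if(!opt.providesGradient())
      break;                                  // gradient-based step impossible
    Eigen::VectorXd grad = opt.gradient();    // dE/dparameters, length dimension()
    // Fixed step size of 0.1, chosen arbitrarily for this sketch.
    opt.setParameters(opt.currentParameters() - 0.1 * grad);
    opt.finishedIteration();                  // notify the Optimizable
  }
}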