#include <itkAdaptiveStochasticGradientDescentOptimizer.h>
This class implements a gradient descent optimizer with adaptive gain.
If E(x) is a cost function that has to be minimised, the following iterative algorithm is used to find the optimal parameters x:
x(k+1) = x(k) - a(t_k) g(k)
The gain a(t_k) at each iteration k is defined by:
a(t_k) = a / (A + t_k + 1)^alpha
And the time t_k is updated according to:
t_{k+1} = max[ 0, t_k + sigmoid( -g(k) * g(k-1) ) ]
where g(k) equals the gradient dE/dx at iteration k. For t_0 the InitialTime is used, which is defined in the superclass (StandardGradientDescentOptimizer). Whereas in the superclass this parameter is superfluous, in this class it makes sense.
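The decaying gain and the descent step above can be sketched in standalone C++. The parameters a0, A, and alpha belong to the superclass; the default values used here are illustrative assumptions, not values taken from this header:

```cpp
#include <cmath>
#include <vector>

// Decaying gain a(t) = a0 / (A + t + 1)^alpha. The parameters a0, A, and alpha
// live in the superclass (StandardGradientDescentOptimizer); the defaults below
// are illustrative assumptions, not defaults from this header.
inline double Gain(double t, double a0 = 400.0, double A = 50.0, double alpha = 0.602)
{
  return a0 / std::pow(A + t + 1.0, alpha);
}

// One descent step x(k+1) = x(k) - a(t_k) * g(k), with g the (possibly
// stochastic) gradient of the cost function evaluated at x(k).
void Step(std::vector<double> & x, const std::vector<double> & g, double t)
{
  const double gain = Gain(t);
  for (std::size_t i = 0; i < x.size(); ++i)
  {
    x[i] -= gain * g[i];
  }
}
```

Because a(t) decreases monotonically in t, keeping the time t_k small (as the adaptive mechanism does for consistent gradients) keeps the step sizes large.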
This method is described in the following references:
[1] P. Cruz, "Almost sure convergence and asymptotical normality of a generalization of Kesten's stochastic approximation algorithm for multidimensional case." Technical Report, 2005. http://hdl.handle.net/2052/74
[2] S. Klein, J.P.W. Pluim, M. Staring, and M.A. Viergever, "Adaptive stochastic gradient descent optimisation for image registration," International Journal of Computer Vision, vol. 81, no. 3, pp. 227-239, 2009. http://dx.doi.org/10.1007/s11263-008-0168-y
This optimizer is very suitable for use in combination with a stochastic estimate of the gradient dE/dx. For example, in image registration problems it is often advantageous to compute the metric derivative (dE/dx) on a new set of randomly selected image samples in each iteration. You may set the parameter NewSamplesEveryIteration to "true" to achieve this effect. For more information on this strategy, see reference [2] above.
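In an elastix parameter file this typically looks like the following fragment, using the usual elastix `(Key value)` syntax (a minimal illustration, not a complete parameter file):

```
(Optimizer "AdaptiveStochasticGradientDescent")
(NewSamplesEveryIteration "true")
```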
Definition at line 72 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
Static Public Member Functions | |
static Pointer | New () |
Static Public Member Functions inherited from itk::StandardGradientDescentOptimizer | |
static Pointer | New () |
Static Public Member Functions inherited from itk::GradientDescentOptimizer2 | |
static Pointer | New () |
Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
static Pointer | New () |
Protected Member Functions | |
AdaptiveStochasticGradientDescentOptimizer () | |
void | UpdateCurrentTime () override |
~AdaptiveStochasticGradientDescentOptimizer () override=default | |
Protected Member Functions inherited from itk::StandardGradientDescentOptimizer | |
virtual double | Compute_a (double k) const |
StandardGradientDescentOptimizer () | |
~StandardGradientDescentOptimizer () override=default | |
Protected Member Functions inherited from itk::GradientDescentOptimizer2 | |
GradientDescentOptimizer2 () | |
void | PrintSelf (std::ostream &os, Indent indent) const override |
~GradientDescentOptimizer2 () override=default | |
Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
virtual void | GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const |
virtual MeasureType | GetScaledValue (const ParametersType &parameters) const |
virtual void | GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const |
void | PrintSelf (std::ostream &os, Indent indent) const override |
ScaledSingleValuedNonLinearOptimizer () | |
void | SetCurrentPosition (const ParametersType &param) override |
virtual void | SetScaledCurrentPosition (const ParametersType &parameters) |
~ScaledSingleValuedNonLinearOptimizer () override=default | |
Protected Attributes | |
DerivativeType | m_PreviousGradient {} |
Protected Attributes inherited from itk::StandardGradientDescentOptimizer | |
double | m_CurrentTime { 0.0 } |
bool | m_UseConstantStep { false } |
Protected Attributes inherited from itk::GradientDescentOptimizer2 | |
DerivativeType | m_Gradient {} |
DerivativeType | m_SearchDirection {} |
StopConditionType | m_StopCondition { MaximumNumberOfIterations } |
Protected Attributes inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
ScaledCostFunctionPointer | m_ScaledCostFunction {} |
ParametersType | m_ScaledCurrentPosition {} |
Private Attributes | |
double | m_SigmoidMax { 1.0 } |
double | m_SigmoidMin { -0.8 } |
double | m_SigmoidScale { 1e-8 } |
bool | m_UseAdaptiveStepSizes { true } |
using itk::AdaptiveStochasticGradientDescentOptimizer::ConstPointer = SmartPointer<const Self> |
Definition at line 82 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
using itk::AdaptiveStochasticGradientDescentOptimizer::Pointer = SmartPointer<Self> |
Definition at line 81 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
using itk::AdaptiveStochasticGradientDescentOptimizer::Self = AdaptiveStochasticGradientDescentOptimizer |
Standard ITK.
Definition at line 78 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
using itk::AdaptiveStochasticGradientDescentOptimizer::Superclass = StandardGradientDescentOptimizer |
Definition at line 79 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
Run-time type information (and related methods).
Reimplemented from itk::StandardGradientDescentOptimizer.
Reimplemented in elastix::AdaptiveStochasticGradientDescent< TElastix >.
itk::AdaptiveStochasticGradientDescentOptimizer::ITK_DISALLOW_COPY_AND_MOVE (AdaptiveStochasticGradientDescentOptimizer)
New() (static): Method for creation through the object factory.
SetSigmoidMax() / GetSigmoidMax() (virtual): Set/Get the maximum of the sigmoid. Should be > 0. Default: 1.0.
SetSigmoidMin() / GetSigmoidMin() (virtual): Set/Get the minimum of the sigmoid. Should be < 0. Default: -0.8.
SetSigmoidScale() / GetSigmoidScale() (virtual): Set/Get the scaling of the sigmoid width. Larger values give a wider sigmoid. Should be > 0. Default: 1e-8.
SetUseAdaptiveStepSizes() / GetUseAdaptiveStepSizes() (virtual): Set/Get whether the adaptive step size mechanism is desired. Default: true.
UpdateCurrentTime() (override, protected, virtual): Function to update the current time. If UseAdaptiveStepSizes is false, this function simply increments the CurrentTime by a constant amount. Otherwise, the CurrentTime is updated according to:
time = max[ 0, time + sigmoid( -gradient * previousgradient ) ]
In that case, the m_PreviousGradient is also updated.
Reimplemented from itk::StandardGradientDescentOptimizer.
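The time update just described can be sketched in standalone C++. The exact functional form of the scaled sigmoid below is an assumption, chosen to be consistent with the parameter descriptions (range [SigmoidMin, SigmoidMax], wider for a larger SigmoidScale); it is not copied from the implementation:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Sigmoid rescaled to the range [sigMin, sigMax]; sigScale controls its width.
// This exact functional form is an assumption consistent with the documented
// parameter meanings, not the library's implementation.
inline double ScaledSigmoid(double x, double sigMin, double sigMax, double sigScale)
{
  return sigMin + (sigMax - sigMin) / (1.0 + std::exp(-x / sigScale));
}

// Sketch of the adaptive time update:
//   time = max[ 0, time + sigmoid( -gradient * previousgradient ) ]
// using the documented defaults SigmoidMax = 1.0, SigmoidMin = -0.8,
// SigmoidScale = 1e-8.
double UpdateTime(double time, const std::vector<double> & gradient,
                  const std::vector<double> & previousGradient,
                  double sigMin = -0.8, double sigMax = 1.0, double sigScale = 1e-8)
{
  double inprod = 0.0;
  for (std::size_t i = 0; i < gradient.size(); ++i)
  {
    inprod += gradient[i] * previousGradient[i];
  }
  return std::max(0.0, time + ScaledSigmoid(-inprod, sigMin, sigMax, sigScale));
}
```

With these defaults, successive gradients pointing in the same direction (positive inner product) push the time towards 0, yielding larger gains, while oscillating gradients increase the time and thus shrink the gains.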
m_PreviousGradient (protected): The PreviousGradient, necessary for the Cruz acceleration.
Definition at line 134 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
m_SigmoidMax (private): Definition at line 139 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
m_SigmoidMin (private): Definition at line 140 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
m_SigmoidScale (private): Definition at line 141 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
m_UseAdaptiveStepSizes (private): Settings. Definition at line 138 of file itkAdaptiveStochasticGradientDescentOptimizer.h.
Generated on 2024-07-17 for elastix by doxygen 1.11.0 (9b424b03c9833626cd435af22a444888fbbb192d)