#include <itkAdaptiveStochasticVarianceReducedGradientOptimizer.h>
This class implements a gradient descent optimizer with adaptive gain.
If C(x) is a cost function that has to be minimized, the following iterative algorithm is used to find the optimal parameters x:

    x(k+1) = x(k) - a(t_k) dC/dx

The gain a(t_k) at each iteration k is defined by:

    a(t_k) = a / (A + t_k + 1)^alpha

And the time t_k is updated according to:

    t_{k+1} = [ t_k + sigmoid( -g_k^T g_{k-1} ) ]^+

where g_k is the gradient dC/dx at iteration k, t_0 = 0, and [x]^+ = max(x, 0).

This method is described in the following references:

[1] P. Cruz, "Almost sure convergence and asymptotical normality of a generalization of Kesten's stochastic approximation algorithm for multidimensional case," Technical Report, 2005. http://hdl.handle.net/2052/74

[2] S. Klein, J.P.W. Pluim, M. Staring, and M.A. Viergever, "Adaptive stochastic gradient descent optimisation for image registration," International Journal of Computer Vision, vol. 81, no. 3, pp. 227-239, 2009. http://dx.doi.org/10.1007/s11263-008-0168-y

This optimizer is very suitable for use in combination with a stochastic estimate of the gradient dC/dx, for example one computed on a new set of randomly selected samples in each iteration. You may set the parameter NewSamplesEveryIteration to "true" to achieve this effect. For more information on this strategy, see reference [2].
Definition at line 70 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.

Static Public Member Functions | |
| static Pointer | New () |
| Static Public Member Functions inherited from itk::StandardStochasticVarianceReducedGradientOptimizer | |
| static Pointer | New () |
| Static Public Member Functions inherited from itk::StochasticVarianceReducedGradientDescentOptimizer | |
| static Pointer | New () |
| Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
| static Pointer | New () |
Protected Member Functions | |
| AdaptiveStochasticVarianceReducedGradientOptimizer () | |
| void | UpdateCurrentTime () override |
| ~AdaptiveStochasticVarianceReducedGradientOptimizer () override=default | |
| Protected Member Functions inherited from itk::StandardStochasticVarianceReducedGradientOptimizer | |
| virtual double | Compute_a (double k) const |
| virtual double | Compute_beta (double k) const |
| StandardStochasticVarianceReducedGradientOptimizer () | |
| ~StandardStochasticVarianceReducedGradientOptimizer () override=default | |
| Protected Member Functions inherited from itk::StochasticVarianceReducedGradientDescentOptimizer | |
| void | PrintSelf (std::ostream &os, Indent indent) const override |
| StochasticVarianceReducedGradientDescentOptimizer () | |
| ~StochasticVarianceReducedGradientDescentOptimizer () override=default | |
| Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
| virtual void | GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const |
| virtual MeasureType | GetScaledValue (const ParametersType &parameters) const |
| virtual void | GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const |
| void | PrintSelf (std::ostream &os, Indent indent) const override |
| ScaledSingleValuedNonLinearOptimizer () | |
| void | SetCurrentPosition (const ParametersType &param) override |
| virtual void | SetScaledCurrentPosition (const ParametersType &parameters) |
| ~ScaledSingleValuedNonLinearOptimizer () override=default | |
Private Attributes | |
| double | m_SigmoidMax { 1.0 } |
| double | m_SigmoidMin { -0.8 } |
| double | m_SigmoidScale { 1e-8 } |
| bool | m_UseAdaptiveStepSizes { true } |
Additional Inherited Members | |
| Protected Types inherited from itk::StochasticVarianceReducedGradientDescentOptimizer | |
| using | ThreadInfoType = MultiThreaderBase::WorkUnitInfo |
| using itk::AdaptiveStochasticVarianceReducedGradientOptimizer::ConstPointer = SmartPointer<const Self> |
Definition at line 80 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
| using itk::AdaptiveStochasticVarianceReducedGradientOptimizer::Pointer = SmartPointer<Self> |
Definition at line 79 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
| using itk::AdaptiveStochasticVarianceReducedGradientOptimizer::Self = AdaptiveStochasticVarianceReducedGradientOptimizer |
Standard ITK.
Definition at line 76 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
| using itk::AdaptiveStochasticVarianceReducedGradientOptimizer::Superclass = StandardStochasticVarianceReducedGradientOptimizer |
Definition at line 77 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
itk::AdaptiveStochasticVarianceReducedGradientOptimizer::ITK_DISALLOW_COPY_AND_MOVE (AdaptiveStochasticVarianceReducedGradientOptimizer)

itk::AdaptiveStochasticVarianceReducedGradientOptimizer::itkOverrideGetNameOfClassMacro (AdaptiveStochasticVarianceReducedGradientOptimizer)
Run-time type information (and related methods).
New() [static]
Method for creation through the object factory.

SigmoidMax [virtual Set/Get accessors]
Set/Get the maximum of the sigmoid. Should be > 0. Default: 1.0.

SigmoidMin [virtual Set/Get accessors]
Set/Get the minimum of the sigmoid. Should be < 0. Default: -0.8.

SigmoidScale [virtual Set/Get accessors]
Set/Get the scaling of the sigmoid width. Larger values give a wider sigmoid. Should be > 0. Default: 1e-8.

UseAdaptiveStepSizes [virtual Set/Get accessors]
Set/Get whether the adaptive step size mechanism is desired. Default: true.
UpdateCurrentTime() [override protected virtual]
Function to update the current time. If UseAdaptiveStepSizes is false, this function simply increments the CurrentTime by a constant. Otherwise, the CurrentTime is updated according to:

    time = max[ 0, time + sigmoid( -gradient * previousgradient ) ]

In that case, the m_PreviousGradient is also updated.
Reimplemented from itk::StandardStochasticVarianceReducedGradientOptimizer.
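The behavior of this update can be sketched as a standalone C++ state machine. This is a hypothetical rendering, not the elastix source: member names mirror the documented attributes, the sigmoid is assumed to be a logistic function rescaled to [SigmoidMin, SigmoidMax], and the non-adaptive increment (truncated on this page) uses 1.0 as a placeholder.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical standalone rendering of the UpdateCurrentTime() logic
// described above -- not the elastix source.
struct AdaptiveTimeState
{
  double              currentTime{ 0.0 };    // "CurrentTime"
  std::vector<double> previousGradient;      // m_PreviousGradient
  bool                useAdaptiveStepSizes{ true };
  double              sigmoidMax{ 1.0 };     // m_SigmoidMax
  double              sigmoidMin{ -0.8 };    // m_SigmoidMin
  double              sigmoidScale{ 1e-8 };  // m_SigmoidScale

  void UpdateCurrentTime(const std::vector<double> & gradient)
  {
    if (!useAdaptiveStepSizes)
    {
      // Non-adaptive branch: fixed increment (placeholder value; the
      // actual constant is not shown on this page).
      currentTime += 1.0;
      return;
    }
    if (!previousGradient.empty())
    {
      // Inner product of the current and previous gradient.
      double inner = 0.0;
      for (std::size_t i = 0; i < gradient.size(); ++i)
        inner += gradient[i] * previousGradient[i];
      // sigmoid(-inner), rescaled to the [sigmoidMin, sigmoidMax] range.
      const double s = sigmoidMin + (sigmoidMax - sigmoidMin) /
                         (1.0 + std::exp(inner / sigmoidScale));
      // time = max[ 0, time + sigmoid( -gradient * previousgradient ) ]
      currentTime = std::max(0.0, currentTime + s);
    }
    previousGradient = gradient; // m_PreviousGradient is updated here
  }
};
```

On the first call there is no previous gradient, so only m_PreviousGradient is stored; from the second call on, each sign flip between successive gradients pushes the time up (smaller gain) and each agreement pulls it down (larger gain), clipped at zero.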
m_PreviousGradient [protected]
The PreviousGradient, necessary for the Cruz acceleration.
Definition at line 132 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
m_SigmoidMax [private]
Definition at line 137 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.

m_SigmoidMin [private]
Definition at line 138 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.

m_SigmoidScale [private]
Definition at line 139 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.

m_UseAdaptiveStepSizes [private]
Settings.
Definition at line 136 of file itkAdaptiveStochasticVarianceReducedGradientOptimizer.h.
Generated on 26-02-2026 for elastix by Doxygen 1.16.1 (669aeeefca743c148e2d935b3d3c69535c7491e6)