itk::FiniteDifferenceGradientDescentOptimizer Class Reference

#include <itkFiniteDifferenceGradientDescentOptimizer.h>

Detailed Description

An optimizer based on gradient descent ...

If $C(x)$ is a cost function that has to be minimized, the following iterative algorithm is used to find the optimal parameters $x$:

\[ x(k+1)_j = x(k)_j - a(k) \, \frac{C(x(k)_j + c(k)) - C(x(k)_j - c(k))}{2 c(k)}, \]

for all parameters $j$.

From this equation it is clear that this is a gradient descent optimizer that uses a central finite difference approximation of the gradient.

The gain $a(k)$ at each iteration $k$ is defined by:

\[ a(k) =  a / (A + k + 1)^{\alpha}. \]

The perturbation size $c(k)$ at each iteration $k$ is defined by:

\[ c(k) =  c / (k + 1)^{\gamma}. \]
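The two decay schedules can be sketched as plain functions. This is an illustrative re-implementation, not the class's own code; the parameter names mirror the `SetParam_a`/`SetParam_A`/`SetParam_alpha`/`SetParam_c`/`SetParam_gamma` members, and the defaults are the member initializers listed under Field Documentation:

```cpp
#include <cmath>

// Gain a(k) = a / (A + k + 1)^alpha, as computed by Compute_a().
double compute_a(unsigned long k, double a = 1.0, double A = 1.0, double alpha = 0.602)
{
  return a / std::pow(A + k + 1, alpha);
}

// Perturbation size c(k) = c / (k + 1)^gamma, as computed by Compute_c().
double compute_c(unsigned long k, double c = 1.0, double gamma = 0.101)
{
  return c / std::pow(k + 1, gamma);
}
```

With the defaults, both sequences decay slowly: `compute_a(0)` is about 0.66 and `compute_c(1)` about 0.93, so the step size and perturbation shrink gradually as the iteration count grows.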

Note the similarities to the SimultaneousPerturbation optimizer and the StandardGradientDescent optimizer.
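As a one-dimensional illustration of the update rule above (a simplified sketch, not the class itself), the following minimizes $C(x) = (x - 3)^2$ with a central finite-difference gradient estimate and the default decay parameters:

```cpp
#include <cmath>
#include <functional>

// Run the finite-difference gradient descent iteration on a 1-D cost
// function, starting from x, for the given number of iterations.
double fdGradientDescent(const std::function<double(double)> & cost,
                         double x, unsigned long numberOfIterations,
                         double a = 1.0, double A = 1.0, double alpha = 0.602,
                         double c = 1.0, double gamma = 0.101)
{
  for (unsigned long k = 0; k < numberOfIterations; ++k)
  {
    const double ak = a / std::pow(A + k + 1, alpha); // gain a(k)
    const double ck = c / std::pow(k + 1, gamma);     // perturbation c(k)
    // Central finite-difference approximation of dC/dx.
    const double gradient = (cost(x + ck) - cost(x - ck)) / (2.0 * ck);
    x -= ak * gradient;
  }
  return x;
}
```

For a quadratic cost the central difference is exact, so the iterate approaches the minimum at $x = 3$ within a few dozen iterations.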

See also
FiniteDifferenceGradientDescent

Definition at line 55 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Inheritance diagram for itk::FiniteDifferenceGradientDescentOptimizer:

Public Types

using ConstPointer = SmartPointer<const Self>
 
using Pointer = SmartPointer<Self>
 
using Self = FiniteDifferenceGradientDescentOptimizer
 
enum  StopConditionType { MaximumNumberOfIterations , MetricError }
 
using Superclass = ScaledSingleValuedNonLinearOptimizer
 
- Public Types inherited from itk::ScaledSingleValuedNonLinearOptimizer
using ConstPointer = SmartPointer<const Self>
 
using Pointer = SmartPointer<Self>
 
using ScaledCostFunctionPointer = ScaledCostFunctionType::Pointer
 
using ScaledCostFunctionType = ScaledSingleValuedCostFunction
 
using ScalesType = NonLinearOptimizer::ScalesType
 
using Self = ScaledSingleValuedNonLinearOptimizer
 
using Superclass = SingleValuedNonLinearOptimizer
 

Public Member Functions

virtual void AdvanceOneStep ()
 
virtual void ComputeCurrentValueOff ()
 
virtual void ComputeCurrentValueOn ()
 
virtual const char * GetClassName () const
 
virtual bool GetComputeCurrentValue () const
 
virtual unsigned long GetCurrentIteration () const
 
virtual double GetGradientMagnitude () const
 
virtual double GetLearningRate () const
 
virtual unsigned long GetNumberOfIterations () const
 
virtual double GetParam_A () const
 
virtual double GetParam_a () const
 
virtual double GetParam_alpha () const
 
virtual double GetParam_c () const
 
virtual double GetParam_gamma () const
 
virtual StopConditionType GetStopCondition () const
 
virtual double GetValue () const
 
 ITK_DISALLOW_COPY_AND_MOVE (FiniteDifferenceGradientDescentOptimizer)
 
void ResumeOptimization ()
 
virtual void SetComputeCurrentValue (bool _arg)
 
virtual void SetNumberOfIterations (unsigned long _arg)
 
virtual void SetParam_A (double _arg)
 
virtual void SetParam_a (double _arg)
 
virtual void SetParam_alpha (double _arg)
 
virtual void SetParam_c (double _arg)
 
virtual void SetParam_gamma (double _arg)
 
void StartOptimization () override
 
void StopOptimization ()
 
- Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
const ParametersType & GetCurrentPosition () const override
 
virtual bool GetMaximize () const
 
virtual const ScaledCostFunctionType * GetScaledCostFunction ()
 
virtual const ParametersType & GetScaledCurrentPosition ()
 
bool GetUseScales () const
 
virtual void InitializeScales ()
 
 ITK_DISALLOW_COPY_AND_MOVE (ScaledSingleValuedNonLinearOptimizer)
 
virtual void MaximizeOff ()
 
virtual void MaximizeOn ()
 
void SetCostFunction (CostFunctionType *costFunction) override
 
virtual void SetMaximize (bool _arg)
 
virtual void SetUseScales (bool arg)
 

Static Public Member Functions

static Pointer New ()
 
- Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
static Pointer New ()
 

Protected Member Functions

virtual double Compute_a (unsigned long k) const
 
virtual double Compute_c (unsigned long k) const
 
 FiniteDifferenceGradientDescentOptimizer ()
 
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ~FiniteDifferenceGradientDescentOptimizer () override=default
 
- Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual void GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const
 
virtual MeasureType GetScaledValue (const ParametersType &parameters) const
 
virtual void GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const
 
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ScaledSingleValuedNonLinearOptimizer ()
 
void SetCurrentPosition (const ParametersType &param) override
 
virtual void SetScaledCurrentPosition (const ParametersType &parameters)
 
 ~ScaledSingleValuedNonLinearOptimizer () override=default
 

Protected Attributes

bool m_ComputeCurrentValue { false }
 
DerivativeType m_Gradient {}
 
double m_GradientMagnitude { 0.0 }
 
double m_LearningRate { 0.0 }
 
- Protected Attributes inherited from itk::ScaledSingleValuedNonLinearOptimizer
ScaledCostFunctionPointer m_ScaledCostFunction {}
 
ParametersType m_ScaledCurrentPosition {}
 

Private Attributes

unsigned long m_CurrentIteration { 0 }
 
unsigned long m_NumberOfIterations { 100 }
 
double m_Param_A { 1.0 }
 
double m_Param_a { 1.0 }
 
double m_Param_alpha { 0.602 }
 
double m_Param_c { 1.0 }
 
double m_Param_gamma { 0.101 }
 
bool m_Stop { false }
 
StopConditionType m_StopCondition { MaximumNumberOfIterations }
 
double m_Value { 0.0 }
 

Member Typedef Documentation

◆ ConstPointer

◆ Pointer

◆ Self

◆ Superclass

Member Enumeration Documentation

◆ StopConditionType

Codes of stopping conditions

Enumerator
MaximumNumberOfIterations 
MetricError 

Definition at line 73 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Constructor & Destructor Documentation

◆ FiniteDifferenceGradientDescentOptimizer()

itk::FiniteDifferenceGradientDescentOptimizer::FiniteDifferenceGradientDescentOptimizer ( )
protected

◆ ~FiniteDifferenceGradientDescentOptimizer()

itk::FiniteDifferenceGradientDescentOptimizer::~FiniteDifferenceGradientDescentOptimizer ( )
overrideprotecteddefault

Member Function Documentation

◆ AdvanceOneStep()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::AdvanceOneStep ( )
virtual

Advance one step following the gradient direction.

◆ Compute_a()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_a ( unsigned long k) const
protectedvirtual

◆ Compute_c()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_c ( unsigned long k) const
protectedvirtual

◆ ComputeCurrentValueOff()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::ComputeCurrentValueOff ( )
virtual

◆ ComputeCurrentValueOn()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::ComputeCurrentValueOn ( )
virtual

◆ GetClassName()

virtual const char * itk::FiniteDifferenceGradientDescentOptimizer::GetClassName ( ) const
virtual

Run-time type information (and related methods).

Reimplemented from itk::ScaledSingleValuedNonLinearOptimizer.

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

◆ GetComputeCurrentValue()

virtual bool itk::FiniteDifferenceGradientDescentOptimizer::GetComputeCurrentValue ( ) const
virtual

◆ GetCurrentIteration()

virtual unsigned long itk::FiniteDifferenceGradientDescentOptimizer::GetCurrentIteration ( ) const
virtual

Get the current iteration number.

◆ GetGradientMagnitude()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetGradientMagnitude ( ) const
virtual

Get the CurrentStepLength, GradientMagnitude and LearningRate (a_k)

◆ GetLearningRate()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetLearningRate ( ) const
virtual

◆ GetNumberOfIterations()

virtual unsigned long itk::FiniteDifferenceGradientDescentOptimizer::GetNumberOfIterations ( ) const
virtual

Get the number of iterations.

◆ GetParam_A()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_A ( ) const
virtual

◆ GetParam_a()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_a ( ) const
virtual

◆ GetParam_alpha()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_alpha ( ) const
virtual

◆ GetParam_c()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_c ( ) const
virtual

◆ GetParam_gamma()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_gamma ( ) const
virtual

◆ GetStopCondition()

virtual StopConditionType itk::FiniteDifferenceGradientDescentOptimizer::GetStopCondition ( ) const
virtual

Get Stop condition.

◆ GetValue()

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetValue ( ) const
virtual

Get the current value.

◆ ITK_DISALLOW_COPY_AND_MOVE()

itk::FiniteDifferenceGradientDescentOptimizer::ITK_DISALLOW_COPY_AND_MOVE ( FiniteDifferenceGradientDescentOptimizer )

◆ New()

static Pointer itk::FiniteDifferenceGradientDescentOptimizer::New ( )
static

Method for creation through the object factory.

◆ PrintSelf()

void itk::FiniteDifferenceGradientDescentOptimizer::PrintSelf ( std::ostream & os,
Indent indent ) const
overrideprotected

PrintSelf method.

◆ ResumeOptimization()

void itk::FiniteDifferenceGradientDescentOptimizer::ResumeOptimization ( )

Resume a previously stopped optimization with the current parameters.

See also
StopOptimization.

◆ SetComputeCurrentValue()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetComputeCurrentValue ( bool _arg)
virtual

◆ SetNumberOfIterations()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetNumberOfIterations ( unsigned long _arg)
virtual

Set the number of iterations.

◆ SetParam_A()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_A ( double _arg)
virtual

Set/Get A.

◆ SetParam_a()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_a ( double _arg)
virtual

Set/Get a.

◆ SetParam_alpha()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_alpha ( double _arg)
virtual

Set/Get alpha.

◆ SetParam_c()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_c ( double _arg)
virtual

Set/Get c.

◆ SetParam_gamma()

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_gamma ( double _arg)
virtual

Set/Get gamma.

◆ StartOptimization()

void itk::FiniteDifferenceGradientDescentOptimizer::StartOptimization ( )
override

Start optimization.

◆ StopOptimization()

void itk::FiniteDifferenceGradientDescentOptimizer::StopOptimization ( )

Stop optimization.

See also
ResumeOptimization

Field Documentation

◆ m_ComputeCurrentValue

bool itk::FiniteDifferenceGradientDescentOptimizer::m_ComputeCurrentValue { false }
protected

Boolean that says whether the current value of the metric has to be computed. This is not necessary for the optimization itself; it is just nice for progress information.

Definition at line 158 of file itkFiniteDifferenceGradientDescentOptimizer.h.

◆ m_CurrentIteration

unsigned long itk::FiniteDifferenceGradientDescentOptimizer::m_CurrentIteration { 0 }
private

◆ m_Gradient

DerivativeType itk::FiniteDifferenceGradientDescentOptimizer::m_Gradient {}
protected

◆ m_GradientMagnitude

double itk::FiniteDifferenceGradientDescentOptimizer::m_GradientMagnitude { 0.0 }
protected

◆ m_LearningRate

double itk::FiniteDifferenceGradientDescentOptimizer::m_LearningRate { 0.0 }
protected

◆ m_NumberOfIterations

unsigned long itk::FiniteDifferenceGradientDescentOptimizer::m_NumberOfIterations { 100 }
private

◆ m_Param_A

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_A { 1.0 }
private

◆ m_Param_a

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_a { 1.0 }
private

Parameters, as described by Spall.

Definition at line 176 of file itkFiniteDifferenceGradientDescentOptimizer.h.

◆ m_Param_alpha

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_alpha { 0.602 }
private

◆ m_Param_c

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_c { 1.0 }
private

◆ m_Param_gamma

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_gamma { 0.101 }
private

◆ m_Stop

bool itk::FiniteDifferenceGradientDescentOptimizer::m_Stop { false }
private

Private member variables.

Definition at line 169 of file itkFiniteDifferenceGradientDescentOptimizer.h.

◆ m_StopCondition

StopConditionType itk::FiniteDifferenceGradientDescentOptimizer::m_StopCondition { MaximumNumberOfIterations }
private

◆ m_Value

double itk::FiniteDifferenceGradientDescentOptimizer::m_Value { 0.0 }
private


Generated on 2024-07-17 for elastix by doxygen 1.11.0