g2o
Public Types | Public Member Functions | Static Public Member Functions | Protected Attributes | Private Attributes | List of all members
g2o::OptimizationAlgorithmDogleg Class Reference

Implementation of Powell's Dogleg Algorithm. More...

#include <optimization_algorithm_dogleg.h>

Inheritance diagram for g2o::OptimizationAlgorithmDogleg: [inheritance graph omitted]
Collaboration diagram for g2o::OptimizationAlgorithmDogleg: [collaboration graph omitted]

Public Types

enum  { STEP_UNDEFINED, STEP_SD, STEP_GN, STEP_DL }
 type of the step to take More...
 

Public Member Functions

 OptimizationAlgorithmDogleg (std::unique_ptr< BlockSolverBase > solver)
 
virtual ~OptimizationAlgorithmDogleg ()
 
virtual SolverResult solve (int iteration, bool online=false)
 
virtual void printVerbose (std::ostream &os) const
 
int lastStep () const
 return the type of the last step taken by the algorithm
 
double trustRegion () const
 return the diameter of the trust region
 
- Public Member Functions inherited from g2o::OptimizationAlgorithmWithHessian
 OptimizationAlgorithmWithHessian (Solver &solver)
 
virtual ~OptimizationAlgorithmWithHessian ()
 
virtual bool init (bool online=false)
 
virtual bool computeMarginals (SparseBlockMatrix< MatrixX > &spinv, const std::vector< std::pair< int, int > > &blockIndices)
 
virtual bool buildLinearStructure ()
 
virtual void updateLinearSystem ()
 
virtual bool updateStructure (const std::vector< HyperGraph::Vertex * > &vset, const HyperGraph::EdgeSet &edges)
 
Solver & solver ()
 return the underlying solver used to solve the linear system
 
virtual void setWriteDebug (bool writeDebug)
 
virtual bool writeDebug () const
 
- Public Member Functions inherited from g2o::OptimizationAlgorithm
 OptimizationAlgorithm ()
 
virtual ~OptimizationAlgorithm ()
 
const SparseOptimizer * optimizer () const
 return the optimizer this algorithm operates on
 
SparseOptimizer * optimizer ()
 
void setOptimizer (SparseOptimizer *optimizer)
 
const PropertyMap & properties () const
 return the properties of the solver
 
bool updatePropertiesFromString (const std::string &propString)
 
void printProperties (std::ostream &os) const
 

Static Public Member Functions

static const char * stepType2Str (int stepType)
 convert the step type into a string
 

Protected Attributes

Property< int > * _maxTrialsAfterFailure
 
Property< double > * _userDeltaInit
 
Property< double > * _initialLambda
 
Property< double > * _lamdbaFactor
 
VectorX _hsd
 steepest descent step
 
VectorX _hdl
 final dogleg step
 
VectorX _auxVector
 
double _currentLambda
 the damping factor to force positive definite matrix
 
double _delta
 trust region
 
int _lastStep
 type of the step taken by the algorithm
 
bool _wasPDInAllIterations
 
int _lastNumTries
 
- Protected Attributes inherited from g2o::OptimizationAlgorithmWithHessian
Solver & _solver
 
Property< bool > * _writeDebug
 
- Protected Attributes inherited from g2o::OptimizationAlgorithm
SparseOptimizer * _optimizer
 the optimizer the solver is working on
 
PropertyMap _properties
 

Private Attributes

std::unique_ptr< BlockSolverBase > m_solver
 

Detailed Description

Implementation of Powell's Dogleg Algorithm.

Definition at line 42 of file optimization_algorithm_dogleg.h.
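The heart of this class is the classic dogleg step selection: take the Gauss-Newton step if it lies inside the trust region, the clipped steepest-descent (Cauchy) step if even that step exceeds the radius, and otherwise a blend of the two. A minimal, self-contained sketch of that rule, using a hypothetical Vec2 helper in place of g2o's VectorX (function and type names here are illustrative, not part of the g2o API):

```cpp
#include <cassert>
#include <cmath>
#include <string>

// Hypothetical 2D vector helpers, standing in for g2o's VectorX.
struct Vec2 { double x, y; };
static double norm(Vec2 v) { return std::sqrt(v.x * v.x + v.y * v.y); }
static double dot(Vec2 a, Vec2 b) { return a.x * b.x + a.y * b.y; }
static Vec2 add(Vec2 a, Vec2 b) { return {a.x + b.x, a.y + b.y}; }
static Vec2 sub(Vec2 a, Vec2 b) { return {a.x - b.x, a.y - b.y}; }
static Vec2 scale(double s, Vec2 v) { return {s * v.x, s * v.y}; }

// Dogleg step selection: given the Gauss-Newton step hgn, the steepest
// descent (Cauchy) step hsd, and the trust-region radius delta, choose
// the step hdl. Mirrors the three branches inside solve().
std::string doglegStep(Vec2 hgn, Vec2 hsd, double delta, Vec2* hdl) {
  if (norm(hgn) < delta) {         // GN step lies inside the trust region
    *hdl = hgn;
    return "GN";
  } else if (norm(hsd) > delta) {  // even the Cauchy step is too long: clip it
    *hdl = scale(delta / norm(hsd), hsd);
    return "Descent";
  } else {                         // blend: hsd + beta * (hgn - hsd)
    Vec2 bma = sub(hgn, hsd);      // b - a
    double c = dot(hsd, bma);
    double bmaSq = dot(bma, bma);
    double hsdSq = dot(hsd, hsd);
    double disc = std::sqrt(c * c + bmaSq * (delta * delta - hsdSq));
    // two algebraically equivalent forms, chosen for numerical stability
    double beta = (c <= 0.) ? (-c + disc) / bmaSq
                            : (delta * delta - hsdSq) / (c + disc);
    assert(beta > 0. && beta < 1.);
    *hdl = add(hsd, scale(beta, bma));
    return "Dogleg";
  }
}
```

The blended step always lands exactly on the trust-region boundary, which is what the `_hdl.norm() < _delta + 1e-5` assertion in solve() checks.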

Member Enumeration Documentation

◆ anonymous enum

anonymous enum

type of the step to take

Enumerator: STEP_UNDEFINED, STEP_SD (steepest descent step), STEP_GN (Gauss-Newton step), STEP_DL (blended dogleg step)

Constructor & Destructor Documentation

◆ OptimizationAlgorithmDogleg()

g2o::OptimizationAlgorithmDogleg::OptimizationAlgorithmDogleg ( std::unique_ptr< BlockSolverBase > solver )
explicit

Constructs the Dogleg algorithm, which uses the given block solver for solving the linearized system.

Definition at line 42 of file optimization_algorithm_dogleg.cpp.

OptimizationAlgorithmDogleg::OptimizationAlgorithmDogleg(
    std::unique_ptr<BlockSolverBase> solver)
    : OptimizationAlgorithmWithHessian(*solver),
      m_solver{std::move(solver)} {
  _userDeltaInit =
      _properties.makeProperty<Property<double>>("initialDelta", (double)1e4);
  _maxTrialsAfterFailure =
      _properties.makeProperty<Property<int>>("maxTrialsAfterFailure", 100);
  _initialLambda =
      _properties.makeProperty<Property<double>>("initialLambda", (double)1e-7);
  _lamdbaFactor =
      _properties.makeProperty<Property<double>>("lambdaFactor", 10.);
  _delta = _userDeltaInit->value();
  _lastStep = STEP_UNDEFINED;
  _wasPDInAllIterations = true;
  _lastNumTries = 0;
  _currentLambda = 0.;
}

References _currentLambda, _delta, _initialLambda, _lamdbaFactor, _lastNumTries, _lastStep, _maxTrialsAfterFailure, g2o::OptimizationAlgorithm::_properties, _userDeltaInit, _wasPDInAllIterations, g2o::PropertyMap::makeProperty(), STEP_UNDEFINED, and g2o::Property< T >::value().
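The initialLambda and lambdaFactor properties registered above drive a simple damping schedule inside solve() whenever the Hessian turned out not to be positive definite: lambda shrinks after a successful solve and grows after a failed one. A self-contained sketch of that schedule (dampingOnResult is a hypothetical helper name; unlike solve(), which aborts with Fail, this sketch merely clamps at maxLambda):

```cpp
#include <algorithm>
#include <cmath>

// Damping schedule mirroring the inner while loop of solve():
// on success, lambda shrinks by a factor 0.5 * lambdaFactor; on failure
// it grows by lambdaFactor, bounded to [minLambda, maxLambda].
double dampingOnResult(double lambda, bool solverOk, double lambdaFactor,
                       double minLambda = 1e-12, double maxLambda = 1e3) {
  if (solverOk)
    return std::max(minLambda, lambda / (0.5 * lambdaFactor));
  return std::min(lambda * lambdaFactor, maxLambda);
}
```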

◆ ~OptimizationAlgorithmDogleg()

g2o::OptimizationAlgorithmDogleg::~OptimizationAlgorithmDogleg ( )
virtual

Definition at line 61 of file optimization_algorithm_dogleg.cpp.

{}

Member Function Documentation

◆ lastStep()

int g2o::OptimizationAlgorithmDogleg::lastStep ( ) const
inline

return the type of the last step taken by the algorithm

Definition at line 61 of file optimization_algorithm_dogleg.h.

{ return _lastStep; }

◆ printVerbose()

void g2o::OptimizationAlgorithmDogleg::printVerbose ( std::ostream &  os) const
virtual

called by the optimizer if verbose. re-implement, if you want to print something

Reimplemented from g2o::OptimizationAlgorithm.

Definition at line 219 of file optimization_algorithm_dogleg.cpp.

{
  os << "\t Delta= " << _delta << "\t step= " << stepType2Str(_lastStep)
     << "\t tries= " << _lastNumTries;
  if (!_wasPDInAllIterations) os << "\t lambda= " << _currentLambda;
}

References _currentLambda, _delta, _lastNumTries, _lastStep, _wasPDInAllIterations, and stepType2Str().

◆ solve()

OptimizationAlgorithm::SolverResult g2o::OptimizationAlgorithmDogleg::solve ( int  iteration,
bool  online = false 
)
virtual

Solve one iteration. The SparseOptimizer running on top will call this for the given number of iterations.

Parameters
    iteration	indicates the current iteration

Implements g2o::OptimizationAlgorithm.

Definition at line 63 of file optimization_algorithm_dogleg.cpp.

{
  assert(_optimizer && "_optimizer not set");
  assert(_solver.optimizer() == _optimizer &&
         "underlying linear solver operates on different graph");

  BlockSolverBase& blockSolver = static_cast<BlockSolverBase&>(_solver);

  if (iteration == 0 &&
      !online) {  // built up the CCS structure, here due to easy time measure
    bool ok = _solver.buildStructure();
    if (!ok) {
      G2O_WARN("{}: Failure while building CCS structure", __PRETTY_FUNCTION__);
      return OptimizationAlgorithm::Fail;
    }

    // init some members to the current size of the problem
    _hsd.resize(_solver.vectorSize());
    _hdl.resize(_solver.vectorSize());
    _auxVector.resize(_solver.vectorSize());
    _delta = _userDeltaInit->value();
    _currentLambda = _initialLambda->value();
    _wasPDInAllIterations = true;
  }

  double t = get_monotonic_time();
  _optimizer->computeActiveErrors();
  G2OBatchStatistics* globalStats = G2OBatchStatistics::globalStats();
  if (globalStats) {
    globalStats->timeResiduals = get_monotonic_time() - t;
    t = get_monotonic_time();
  }

  double currentChi = _optimizer->activeRobustChi2();

  _solver.buildSystem();
  if (globalStats) {
    globalStats->timeQuadraticForm = get_monotonic_time() - t;
  }

  VectorX::ConstMapType b(_solver.b(), _solver.vectorSize());

  // compute alpha
  _auxVector.setZero();
  blockSolver.multiplyHessian(_auxVector.data(), _solver.b());
  double bNormSquared = b.squaredNorm();
  double alpha = bNormSquared / _auxVector.dot(b);

  _hsd = alpha * b;
  double hsdNorm = _hsd.norm();
  double hgnNorm = -1.;

  bool solvedGaussNewton = false;
  bool goodStep = false;
  int& numTries = _lastNumTries;
  numTries = 0;
  do {
    ++numTries;

    if (!solvedGaussNewton) {
      const double minLambda = cst(1e-12);
      const double maxLambda = cst(1e3);
      solvedGaussNewton = true;
      // apply a damping factor to enforce positive definite Hessian, if the
      // matrix appeared to be not positive definite in at least one iteration
      // before. We apply a damping factor to obtain a PD matrix.
      bool solverOk = false;
      while (!solverOk) {
        if (!_wasPDInAllIterations)
          _solver.setLambda(_currentLambda,
                            true);  // add _currentLambda to the diagonal
        solverOk = _solver.solve();
        if (!_wasPDInAllIterations) _solver.restoreDiagonal();
        _wasPDInAllIterations = _wasPDInAllIterations && solverOk;
        if (!_wasPDInAllIterations) {
          // simple strategy to control the damping factor
          if (solverOk) {
            _currentLambda =
                std::max(minLambda,
                         _currentLambda / (cst(0.5) * _lamdbaFactor->value()));
          } else {
            _currentLambda *= _lamdbaFactor->value();
            if (_currentLambda > maxLambda) {
              _currentLambda = maxLambda;
              return Fail;
            }
          }
        }
      }
      hgnNorm = VectorX::ConstMapType(_solver.x(), _solver.vectorSize()).norm();
    }

    VectorX::ConstMapType hgn(_solver.x(), _solver.vectorSize());
    assert(hgnNorm >= 0. && "Norm of the GN step is not computed");

    if (hgnNorm < _delta) {
      _hdl = hgn;
      _lastStep = STEP_GN;
    } else if (hsdNorm > _delta) {
      _hdl = _delta / hsdNorm * _hsd;
      _lastStep = STEP_SD;
    } else {
      _auxVector = hgn - _hsd;  // b - a
      double c = _hsd.dot(_auxVector);
      double bmaSquaredNorm = _auxVector.squaredNorm();
      double beta;
      if (c <= 0.)
        beta = (-c + sqrt(c * c + bmaSquaredNorm *
                                      (_delta * _delta - _hsd.squaredNorm()))) /
               bmaSquaredNorm;
      else {
        double hsdSqrNorm = _hsd.squaredNorm();
        beta =
            (_delta * _delta - hsdSqrNorm) /
            (c + sqrt(c * c + bmaSquaredNorm * (_delta * _delta - hsdSqrNorm)));
      }
      assert(beta > 0. && beta < 1 && "Error while computing beta");
      _hdl = _hsd + beta * (hgn - _hsd);
      _lastStep = STEP_DL;
      assert(_hdl.norm() < _delta + 1e-5 &&
             "Computed step does not correspond to the trust region");
    }

    // compute the linear gain
    _auxVector.setZero();
    blockSolver.multiplyHessian(_auxVector.data(), _hdl.data());
    double linearGain = -1 * (_auxVector.dot(_hdl)) + 2 * (b.dot(_hdl));

    // apply the update and see what happens
    _optimizer->push();
    _optimizer->update(_hdl.data());
    _optimizer->computeActiveErrors();
    double newChi = _optimizer->activeRobustChi2();
    double nonLinearGain = currentChi - newChi;
    if (fabs(linearGain) < 1e-12) linearGain = cst(1e-12);
    double rho = nonLinearGain / linearGain;
    if (rho > 0) {  // step is good and will be accepted
      _optimizer->discardTop();
      goodStep = true;
    } else {  // recover previous state
      _optimizer->pop();
    }

    // update trust region based on the step quality
    if (rho > 0.75)
      _delta = std::max<double>(_delta, 3 * _hdl.norm());
    else if (rho < 0.25)
      _delta *= 0.5;
  } while (!goodStep && numTries < _maxTrialsAfterFailure->value());
  if (numTries == _maxTrialsAfterFailure->value() || !goodStep)
    return Terminate;
  return OK;
}

References __PRETTY_FUNCTION__, _auxVector, _currentLambda, _delta, _hdl, _hsd, _initialLambda, _lamdbaFactor, _lastNumTries, _lastStep, _maxTrialsAfterFailure, g2o::OptimizationAlgorithm::_optimizer, g2o::OptimizationAlgorithmWithHessian::_solver, _userDeltaInit, _wasPDInAllIterations, g2o::SparseOptimizer::activeRobustChi2(), g2o::Solver::b(), g2o::Solver::buildStructure(), g2o::Solver::buildSystem(), g2o::SparseOptimizer::computeActiveErrors(), g2o::cst(), g2o::SparseOptimizer::discardTop(), G2O_WARN, g2o::get_monotonic_time(), g2o::G2OBatchStatistics::globalStats(), g2o::BlockSolverBase::multiplyHessian(), OK, g2o::Solver::optimizer(), g2o::SparseOptimizer::pop(), g2o::SparseOptimizer::push(), g2o::Solver::restoreDiagonal(), g2o::Solver::setLambda(), g2o::Solver::solve(), STEP_DL, STEP_GN, STEP_SD, Terminate, g2o::G2OBatchStatistics::timeQuadraticForm, g2o::G2OBatchStatistics::timeResiduals, g2o::SparseOptimizer::update(), g2o::Property< T >::value(), g2o::Solver::vectorSize(), and g2o::Solver::x().
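The acceptance test and trust-region update at the end of solve() can be isolated: the gain ratio rho compares the actual chi² decrease against the decrease predicted by the linear model, the radius grows on very good steps (rho > 0.75) and halves on poor ones (rho < 0.25), and the step is accepted iff the error actually decreased. A self-contained sketch of that logic (updateTrustRegion is a hypothetical helper name, not g2o API):

```cpp
#include <algorithm>
#include <cmath>

// Trust-region control as at the end of OptimizationAlgorithmDogleg::solve():
// rho = (actual chi2 decrease) / (decrease predicted by the linear model).
// Returns whether the step should be accepted (rho > 0) and updates delta.
bool updateTrustRegion(double currentChi, double newChi, double linearGain,
                       double stepNorm, double* delta) {
  if (std::fabs(linearGain) < 1e-12) linearGain = 1e-12;  // avoid division by 0
  double rho = (currentChi - newChi) / linearGain;
  if (rho > 0.75)
    *delta = std::max(*delta, 3 * stepNorm);  // model is trustworthy: expand
  else if (rho < 0.25)
    *delta *= 0.5;                            // model is poor: shrink
  return rho > 0;  // accept iff the error actually decreased
}
```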

◆ stepType2Str()

const char * g2o::OptimizationAlgorithmDogleg::stepType2Str ( int  stepType)
static

convert the step type into a string

Definition at line 225 of file optimization_algorithm_dogleg.cpp.

{
  switch (stepType) {
    case STEP_SD:
      return "Descent";
    case STEP_GN:
      return "GN";
    case STEP_DL:
      return "Dogleg";
    default:
      return "Undefined";
  }
}

References STEP_DL, STEP_GN, and STEP_SD.

Referenced by printVerbose().

◆ trustRegion()

double g2o::OptimizationAlgorithmDogleg::trustRegion ( ) const
inline

return the diameter of the trust region

Definition at line 63 of file optimization_algorithm_dogleg.h.

{ return _delta; }

Member Data Documentation

◆ _auxVector

VectorX g2o::OptimizationAlgorithmDogleg::_auxVector
protected

auxiliary vector used as scratch space, e.g. for Hessian-vector products

Definition at line 78 of file optimization_algorithm_dogleg.h.

Referenced by solve().

◆ _currentLambda

double g2o::OptimizationAlgorithmDogleg::_currentLambda
protected

the damping factor to force positive definite matrix

Definition at line 82 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), printVerbose(), and solve().

◆ _delta

double g2o::OptimizationAlgorithmDogleg::_delta
protected

trust region

Definition at line 83 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), printVerbose(), and solve().

◆ _hdl

VectorX g2o::OptimizationAlgorithmDogleg::_hdl
protected

final dogleg step

Definition at line 77 of file optimization_algorithm_dogleg.h.

Referenced by solve().

◆ _hsd

VectorX g2o::OptimizationAlgorithmDogleg::_hsd
protected

steepest descent step

Definition at line 76 of file optimization_algorithm_dogleg.h.

Referenced by solve().

◆ _initialLambda

Property<double>* g2o::OptimizationAlgorithmDogleg::_initialLambda
protected

Definition at line 73 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), and solve().

◆ _lamdbaFactor

Property<double>* g2o::OptimizationAlgorithmDogleg::_lamdbaFactor
protected

Definition at line 74 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), and solve().

◆ _lastNumTries

int g2o::OptimizationAlgorithmDogleg::_lastNumTries
protected

number of trials performed in the last call to solve()

Referenced by OptimizationAlgorithmDogleg(), printVerbose(), and solve().

◆ _lastStep

int g2o::OptimizationAlgorithmDogleg::_lastStep
protected

type of the step taken by the algorithm

Definition at line 84 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), printVerbose(), and solve().

◆ _maxTrialsAfterFailure

Property<int>* g2o::OptimizationAlgorithmDogleg::_maxTrialsAfterFailure
protected

Definition at line 70 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), and solve().

◆ _userDeltaInit

Property<double>* g2o::OptimizationAlgorithmDogleg::_userDeltaInit
protected

Definition at line 71 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), and solve().

◆ _wasPDInAllIterations

bool g2o::OptimizationAlgorithmDogleg::_wasPDInAllIterations
protected

whether the system matrix was positive definite in all iterations; if not, damping is applied

Definition at line 85 of file optimization_algorithm_dogleg.h.

Referenced by OptimizationAlgorithmDogleg(), printVerbose(), and solve().

◆ m_solver

std::unique_ptr<BlockSolverBase> g2o::OptimizationAlgorithmDogleg::m_solver
private

Definition at line 90 of file optimization_algorithm_dogleg.h.


The documentation for this class was generated from the following files:

optimization_algorithm_dogleg.h
optimization_algorithm_dogleg.cpp