QuantLib: NonLinearLeastSquare Class Reference

Non-linear least-square method.
#include <ql/math/optimization/leastsquare.hpp>
Public Member Functions

| Return type | Member | Description |
| --- | --- | --- |
| | NonLinearLeastSquare (Constraint &c, Real accuracy=1e-4, Size maxiter=100) | Constructor. |
| | NonLinearLeastSquare (Constraint &c, Real accuracy, Size maxiter, boost::shared_ptr< OptimizationMethod > om) | Constructor taking an explicit optimization method. |
| | ~NonLinearLeastSquare () | Destructor. |
| Array & | perform (LeastSquareProblem &lsProblem) | Solve the least-squares problem with the configured solver. |
| void | setInitialValue (const Array &initialValue) | Set the initial guess. |
| Array & | results () | Return the results. |
| Real | residualNorm () | Return the least-squares residual norm. |
| Real | lastValue () | Return the last function value. |
| Integer | exitFlag () | Return the exit flag. |
| Integer | iterationsNumber () | Return the number of iterations performed. |
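A minimal usage sketch of the interface above (assuming QuantLib headers are available; the `LinearFitProblem` subclass, its model, and its data are illustrative, not part of the library):

```cpp
#include <ql/math/optimization/leastsquare.hpp>
#include <ql/math/optimization/constraint.hpp>

using namespace QuantLib;

// Illustrative LeastSquareProblem: fit phi(x, t) = x[0] + x[1]*t
// to target data b at observation times t.
class LinearFitProblem : public LeastSquareProblem {
  public:
    LinearFitProblem(const Array& t, const Array& b) : t_(t), b_(b) {}
    Size size() override { return b_.size(); }
    void targetAndValue(const Array& x, Array& target,
                        Array& fct2fit) override {
        target = b_;
        for (Size i = 0; i < b_.size(); ++i)
            fct2fit[i] = x[0] + x[1] * t_[i];
    }
    void targetValueAndGradient(const Array& x, Matrix& grad,
                                Array& target, Array& fct2fit) override {
        targetAndValue(x, target, fct2fit);
        for (Size i = 0; i < b_.size(); ++i) {
            grad[i][0] = 1.0;    // d phi / d x[0]
            grad[i][1] = t_[i];  // d phi / d x[1]
        }
    }
  private:
    Array t_, b_;
};

int main() {
    Array t(4), b(4);
    // ... fill t and b with observation times and target data ...
    LinearFitProblem problem(t, b);

    NoConstraint constraint;
    NonLinearLeastSquare lsq(constraint, 1e-8, 1000);
    Array guess(2, 0.0);
    lsq.setInitialValue(guess);
    Array x = lsq.perform(problem);  // fitted parameters
    // lsq.residualNorm(), lsq.exitFlag(), lsq.iterationsNumber()
    // report diagnostics after the fit.
    return 0;
}
```
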
Non-linear least-square method.
Using a given optimization algorithm (default is conjugate gradient), solve

$$ \min \{\, r(x) : x \in \mathbb{R}^n \,\} $$

where $r(x) = |f(x)|^2$ is the squared Euclidean norm of $f(x)$ for some vector-valued function $f$ from $\mathbb{R}^n$ to $\mathbb{R}^m$,

$$ f = (f_1, \ldots, f_m), $$

with $f_i(x) = b_i - \phi(x, t_i)$, where $b$ is the vector of target data and $\phi$ is a scalar function.

Assuming the differentiability of $f$, the gradient of $r$ is defined by

$$ \operatorname{grad} r(x) = f'(x)^{\mathrm{t}} \cdot f(x). $$