Summary of Ceres loss functions (loss_function)

0. Purpose of loss functions

The input data of a least-squares problem may contain outliers (e.g. values produced by erroneous measurements); a loss function is used to reduce the influence of such data.
The LossFunction down-weights terms with large residuals so that they do not have an outsized effect on the final optimization result.
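For example, with the Cauchy loss $\rho(s)=\log(1+s)$, a term whose squared residual norm is $s=100$ contributes only $\log(101)\approx 4.6$ to the objective instead of $100$, so a single bad measurement cannot dominate the fit.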

1. Using a loss function in Ceres

Constructing the loss function

ceres::Problem problem;
ceres::LossFunction *loss_function;                           // robust loss function
//loss_function = new ceres::HuberLoss(0.1);                  // Huber loss
loss_function = new ceres::CauchyLoss(0.1);                   // Cauchy loss

The argument 0.1 is the scale of the loss: points whose residual exceeds 0.1 have their weight reduced (the exact effect depends on the chosen loss function), while residuals below 0.1 are treated as normal and are not specially handled.
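This down-weighting can be checked directly with ceres::LossFunction::Evaluate, which takes the squared residual norm s and fills rho[0..2] with the loss value and its first two derivatives; rho[1] acts as the effective weight of the term. A minimal sketch (the sample residual values are arbitrary):

#include <cstdio>
#include <ceres/ceres.h>

int main() {
  ceres::CauchyLoss loss(0.1);
  double rho[3];

  // Residual well below the scale (|r| = 0.01, s = |r|^2 = 1e-4): weight stays close to 1.
  loss.Evaluate(0.01 * 0.01, rho);
  std::printf("small residual: weight rho'(s) = %.4f\n", rho[1]);

  // Residual well above the scale (|r| = 1.0, s = 1.0): weight drops to about 0.01.
  loss.Evaluate(1.0, rho);
  std::printf("large residual: weight rho'(s) = %.4f\n", rho[1]);
  return 0;
}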

Adding the loss function to a residual block

// Add a residual block (cost function, loss function, parameter blocks)
problem.AddResidualBlock(cost_function, loss_function, para_q, para_t);
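For context, here is a self-contained sketch of the full pattern, in the spirit of the Ceres curve-fitting examples; the functor name ExponentialResidual, the data values, and the scale 0.5 are made up for illustration:

#include <cmath>
#include <ceres/ceres.h>

// Hypothetical residual: fit y = exp(m * x + c) to noisy samples.
struct ExponentialResidual {
  ExponentialResidual(double x, double y) : x_(x), y_(y) {}
  template <typename T>
  bool operator()(const T* const m, const T* const c, T* residual) const {
    residual[0] = y_ - exp(m[0] * x_ + c[0]);
    return true;
  }
  double x_, y_;
};

int main() {
  double m = 0.0, c = 0.0;  // parameters to optimize
  // The last sample is an outlier.
  const double data[][2] = {{0.0, 1.1}, {0.5, 1.6}, {1.0, 2.8}, {1.5, 40.0}};

  ceres::Problem problem;
  for (const auto& d : data) {
    ceres::CostFunction* cost_function =
        new ceres::AutoDiffCostFunction<ExponentialResidual, 1, 1, 1>(
            new ExponentialResidual(d[0], d[1]));
    // The Cauchy loss keeps the outlier from dominating the fit.
    problem.AddResidualBlock(cost_function, new ceres::CauchyLoss(0.5), &m, &c);
  }

  ceres::Solver::Options options;
  ceres::Solver::Summary summary;
  ceres::Solve(options, &problem, &summary);
  return 0;
}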

2. Plots of the loss functions

(Figure: curves of the built-in Ceres loss functions ρ(s).)

3. The loss functions built into Ceres:

class TrivialLoss: $\rho(s)=s$
class HuberLoss: $\rho(s)=\begin{cases} s & s \leqslant 1 \\ 2\sqrt{s}-1 & s > 1 \end{cases}$

class SoftLOneLoss: $\rho(s)=2(\sqrt{1+s}-1)$
class CauchyLoss: $\rho(s)=\log(1+s)$
class ArctanLoss: $\rho(s)=\arctan(s)$
class TolerantLoss: $\rho(s,a,b)=b\log(1+e^{(s-a)/b})-b\log(1+e^{-a/b})$
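These are the unscaled forms (scale $a=1$); in code, each class takes its scale in the constructor (TolerantLoss takes both $a$ and $b$), as documented in the Ceres reference listed under References. A minimal sketch constructing each of them and comparing their values at one point; the scale values and the sample point are arbitrary, and in real code ownership of the loss objects is normally handed to Problem::AddResidualBlock:

#include <cstdio>
#include <ceres/ceres.h>

int main() {
  ceres::LossFunction* trivial  = new ceres::TrivialLoss();          // rho(s) = s, no robustification
  ceres::LossFunction* huber    = new ceres::HuberLoss(1.0);
  ceres::LossFunction* soft_l1  = new ceres::SoftLOneLoss(1.0);
  ceres::LossFunction* cauchy   = new ceres::CauchyLoss(1.0);
  ceres::LossFunction* arctan   = new ceres::ArctanLoss(1.0);
  ceres::LossFunction* tolerant = new ceres::TolerantLoss(1.0, 0.1);

  const ceres::LossFunction* losses[] = {trivial, huber, soft_l1, cauchy, arctan, tolerant};
  for (const ceres::LossFunction* loss : losses) {
    double rho[3];
    loss->Evaluate(4.0, rho);  // s = 4, i.e. residual norm 2
    std::printf("rho(4) = %8.4f   rho'(4) = %6.4f\n", rho[0], rho[1]);
  }
  return 0;
}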

Taking CauchyLoss as an example, its declaration in the header is:

// Inspired by the Cauchy distribution
//   rho(s) = log(1 + s).
// At s = 0: rho = [0, 1, -1].
class CERES_EXPORT CauchyLoss : public LossFunction {
 public:
  // The constructor argument 'a' is the scale parameter of the loss.
  explicit CauchyLoss(double a) : b_(a * a), c_(1 / b_) {}
  void Evaluate(double, double*) const override;

 private:
  // b = a^2.
  const double b_;
  // c = 1 / a^2.
  const double c_;
};
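Substituting $b = a^2$ and $c = 1/a^2$ into the implementation below shows that the class evaluates the scaled Cauchy loss for the constructor's scale $a$, which reduces to $\log(1+s)$ when $a=1$:

$$\rho_a(s)=a^2\log\left(1+\frac{s}{a^2}\right),\qquad \rho_a'(s)=\frac{1}{1+s/a^2},\qquad \rho_a''(s)=-\frac{1}{a^2\left(1+s/a^2\right)^2}.$$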

The implementation is:

void CauchyLoss::Evaluate(double s, double rho[3]) const {
  const double sum = 1.0 + s * c_;
  const double inv = 1.0 / sum;
  // 'sum' and 'inv' are always positive, assuming that 's' is.
  rho[0] = b_ * log(sum);                                      // loss value rho(s)
  rho[1] = std::max(std::numeric_limits<double>::min(), inv);  // first derivative rho'(s), clamped to stay positive
  rho[2] = -c_ * (inv * inv);                                  // second derivative rho''(s)
}
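As a quick sanity check of the header comment "At s = 0: rho = [0, 1, -1]", here is a minimal sketch evaluating a CauchyLoss with scale 1 at s = 0:

#include <cstdio>
#include <ceres/ceres.h>

int main() {
  ceres::CauchyLoss loss(1.0);  // a = 1, so rho(s) = log(1 + s)
  double rho[3];
  loss.Evaluate(0.0, rho);
  // Expected output: rho = [0, 1, -1]
  std::printf("rho = [%g, %g, %g]\n", rho[0], rho[1], rho[2]);
  return 0;
}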

References

https://blog.csdn.net/qq_42700518/article/details/105898222
https://blog.csdn.net/weishaodong/article/details/105876633
https://zhaohailong.blog.csdn.net/article/details/123850009
http://ceres-solver.org/nnls_modeling.html#instances
