Boosting series:
AdaBoost: uses the error rate of the previous round's weak learner to update the training-set sample weights (misclassified samples get larger weights, so the next weak learner focuses on them), and iterates this way round after round.
https://snaildove.github.io/2018/10/01/8.Booting-Methods_LiHang-Statistical-Learning-Methods/
GBDT:https://www.cnblogs.com/pinard/p/6140514.html
XGBoost:https://snaildove.github.io/2018/10/02/get-started-XGBoost/
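The AdaBoost reweighting idea above can be sketched in code. This is a minimal from-scratch illustration with decision stumps (not the exact formulation from the linked notes); the stump search, the `1e-10` floor on the error, and the round count are simplifying assumptions for the sketch:

```python
import numpy as np

def adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost sketch with decision stumps; labels y must be in {-1, +1}.

    Each round: fit the best weighted stump, compute its weighted error
    rate, derive its vote weight alpha from that error, then re-weight
    the samples so misclassified ones count more in the next round.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)  # start with uniform sample weights
    stumps = []
    for _ in range(n_rounds):
        # exhaustively pick the best stump (feature, threshold, sign)
        # under the current sample weights
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = np.where(X[:, j] <= thr, sign, -sign)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-10)  # floor to avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # vote weight of this stump
        pred = np.where(X[:, j] <= thr, sign, -sign)
        # core AdaBoost step: up-weight misclassified samples,
        # down-weight correctly classified ones, then renormalize
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((alpha, j, thr, sign))

    def predict(Xq):
        # final classifier: sign of the alpha-weighted stump votes
        score = sum(a * np.where(Xq[:, j] <= t, s, -s)
                    for a, j, t, s in stumps)
        return np.sign(score)
    return predict

# toy 1-D example: one threshold separates the classes
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
predict = adaboost(X, y, n_rounds=5)
print(predict(X))
```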