Gradient boosting is an ensemble method that generalizes boosting by allowing the use of other loss functions ("standard" boosting implicitly uses an exponential loss function).
These slides show the ins and outs of the method. Gradient boosting for regression is detailed first; the classification problem is presented thereafter.
The solutions implemented in the R and Python packages are studied.
Keywords: boosting, regression tree, package gbm, package mboost, package xgboost, R, Python, package scikit-learn, sklearn
Slides: Gradient Boosting
References:
R. Rakotomalala, "Bagging, Random Forest, Boosting", December 2015.
A. Natekin, A. Knoll, "Gradient boosting machines, a tutorial", in Frontiers in Neurorobotics, December 2013.