Accelerating boosting via accelerated greedy coordinate descent

Abstract

We exploit the connection between boosting and greedy coordinate descent to derive new accelerated boosting methods. Specifically, we investigate larger block sizes, improved coordinate selection rules, and momentum-based acceleration. Numerical results show faster training convergence across several data sets. The code is publicly available.
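As a rough illustration of the boosting/coordinate-descent connection the abstract refers to, the sketch below runs greedy (Gauss-Southwell) coordinate descent on a least-squares loss, where each column of a feature matrix stands in for a weak learner, and adds a heavy-ball momentum term. This is a minimal toy, not the paper's method: the function name, step sizes, and momentum constant are all illustrative assumptions.

```python
import numpy as np

def greedy_cd_boost(H, y, steps=500, lr=0.5, momentum=0.5):
    """Greedy coordinate descent on the loss 0.5 * ||y - H w||^2.

    Each column of H plays the role of a weak learner. At every step the
    coordinate with the largest gradient magnitude is selected
    (Gauss-Southwell rule) and updated with a heavy-ball momentum term.
    Hyperparameters here are illustrative, not from the paper.
    """
    n, p = H.shape
    w = np.zeros(p)
    v = np.zeros(p)  # per-coordinate velocity for momentum
    for _ in range(steps):
        grad = H.T @ (H @ w - y)        # full gradient of the loss
        j = np.argmax(np.abs(grad))     # greedy (Gauss-Southwell) selection
        # damped exact coordinate step plus momentum carried over for j
        v[j] = momentum * v[j] - lr * grad[j] / (H[:, j] @ H[:, j])
        w[j] += v[j]
    return w

# Toy example: recover a sparse combination of random "weak learners".
rng = np.random.default_rng(0)
H = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[[3, 7]] = [2.0, -1.5]
y = H @ w_true
w_hat = greedy_cd_boost(H, y)
```

Replacing the single-coordinate selection with the top-k coordinates would give a block variant in the spirit of the "larger block sizes" mentioned above.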

Xiaomeng Ju
PhD candidate in Statistics

I am a PhD candidate in Statistics at the University of British Columbia, advised by Professor Matias Salibian-Barrera. I received my BSc in Statistics from Renmin University of China and my MA in Statistics from the University of Michigan. My research centres on computational statistics, with a particular focus on robust statistics and functional data. My ongoing thesis work develops gradient boosting methods for regression problems with complex data.