Accelerating boosting via accelerated greedy coordinate descent

Abstract

We exploit the connection between boosting and greedy coordinate optimization to produce new accelerated boosting methods. Specifically, we consider larger block sizes, improved coordinate-selection rules, and momentum-type acceleration. Numerical experiments show faster training convergence across several data sets. Our code is publicly available.
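To make the boosting/coordinate-descent connection concrete, the sketch below runs greedy (Gauss-Southwell) coordinate descent on a least-squares objective: at each step the coordinate with the largest-magnitude partial derivative is updated, just as boosting greedily adds the weak learner with the steepest descent direction. This is an illustrative toy, not the paper's method; the function name and all parameters are choices made here.

```python
import numpy as np

def greedy_coordinate_descent(A, b, n_iters=1000):
    # Minimize f(x) = 0.5 * ||A x - b||^2 by greedy coordinate descent.
    # Gauss-Southwell rule: update the coordinate with the largest
    # absolute partial derivative; in boosting, each coordinate plays
    # the role of one weak learner.
    n, p = A.shape
    x = np.zeros(p)
    residual = -b                          # equals A x - b at x = 0
    col_norms_sq = (A ** 2).sum(axis=0)    # per-coordinate curvature
    for _ in range(n_iters):
        grad = A.T @ residual              # full gradient at current x
        j = np.argmax(np.abs(grad))        # greedy selection rule
        step = -grad[j] / col_norms_sq[j]  # exact line search along e_j
        x[j] += step
        residual += step * A[:, j]
    return x
```

Block and momentum variants modify the same loop: a block rule updates the top-k coordinates per iteration, and momentum adds an extrapolation step between coordinate updates.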

Xiaomeng Ju
Postdoctoral research fellow in Biostatistics

I am a postdoctoral research fellow in the Division of Biostatistics at New York University Grossman School of Medicine, advised by Professor Matias Salibian-Barrera. I received my BSc in Statistics from Renmin University of China and my MA in Statistics from the University of Michigan. My research is centred on computational statistics, with a special focus on robust statistics and functional data. My ongoing thesis work develops gradient boosting methods for regression problems with complex data.