Buschjaeger/etal/2020c: Generalized Negative Correlation Learning for Deep Ensembling

Bibtype Misc
Bibkey Buschjaeger/etal/2020c
Author Sebastian Buschjäger and Lukas Pfahler and Katharina Morik
Ls8autor Buschjäger, Sebastian
Morik, Katharina
Pfahler, Lukas
Title Generalized Negative Correlation Learning for Deep Ensembling
Abstract Ensemble algorithms offer state-of-the-art performance in many machine learning applications. A common explanation for their excellent performance rests on the bias-variance decomposition of the mean squared error, which shows that an algorithm's error can be decomposed into its bias and its variance. Both quantities are often at odds, and ensembles offer an effective way to manage them: they reduce the variance through a diverse set of base learners while keeping the bias low at the same time. Even though there have been numerous works on decomposing other loss functions, the exact mathematical connection is rarely exploited explicitly for ensembling, but merely used as a guiding principle. In this paper, we formulate a generalized bias-variance decomposition for arbitrary twice differentiable loss functions and study it in the context of Deep Learning. We use this decomposition to derive a Generalized Negative Correlation Learning (GNCL) algorithm which offers explicit control over the ensemble's diversity and smoothly interpolates between the two extremes of independent training and joint training of the ensemble. We show how GNCL encapsulates many previous works and discuss under which circumstances the training of an ensemble of neural networks might fail and which ensembling method should be favored depending on the choice of the individual networks.
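The following is a minimal, illustrative sketch of the negative-correlation-style objective sketched in the abstract, restricted to the squared error, where the underlying decomposition is exact: the ensemble's MSE equals the average member MSE minus the average member variance around the ensemble mean. The function name gncl_mse_loss, the trade-off parameter lam, the network architecture, and the training step are all assumptions for illustration, not the paper's reference implementation; the paper itself generalizes this to arbitrary twice differentiable losses.

import torch
import torch.nn as nn

def gncl_mse_loss(member_outputs: torch.Tensor, y: torch.Tensor, lam: float) -> torch.Tensor:
    """GNCL-style objective for squared error.

    member_outputs: (M, B, 1) predictions of M ensemble members on a batch.
    lam = 0 trains the members independently; lam = 1 trains the
    ensemble average jointly (exact for MSE via the ambiguity decomposition).
    """
    f_bar = member_outputs.mean(dim=0)                  # ensemble prediction, (B, 1)
    member_err = ((member_outputs - y) ** 2).mean()     # average individual loss
    diversity = ((member_outputs - f_bar) ** 2).mean()  # average member variance
    return member_err - lam * diversity

# Tiny usage example with assumed hyperparameters.
M, B = 4, 32
members = nn.ModuleList(
    nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1)) for _ in range(M)
)
opt = torch.optim.Adam(members.parameters(), lr=1e-3)

x = torch.randn(B, 10)
y = torch.randn(B, 1)

outputs = torch.stack([m(x) for m in members])  # (M, B, 1)
loss = gncl_mse_loss(outputs, y, lam=0.5)       # lam interpolates the two extremes
opt.zero_grad()
loss.backward()
opt.step()

Setting lam = 0 reduces the loss to the average of the members' individual errors (independent training), while lam = 1 makes it algebraically equal to the MSE of the averaged prediction (joint training); intermediate values give explicit control over the diversity term.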
Year 2020
Projekt SFB876-A1
Url https://arxiv.org/abs/2011.02952