Ensemble Learning



Ensembles of Neural Networks (ongoing work)


Ensemble learning is a machine learning technique that builds a strong model out of multiple models, called weak learners, to obtain a prediction accuracy higher than any of the weak learners could achieve alone.
This work focuses on ensembles of neural networks, carefully explaining and visualizing the main research contributions in this field. Find the landing page here, or go directly to the resources:
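As a minimal illustration of this idea (the probabilities below are toy numbers, not outputs of real models), an ensemble can average the class probabilities produced by its members, a scheme often called soft voting:

```python
import numpy as np

def ensemble_predict(member_probs):
    """Average the class-probability vectors of the ensemble members."""
    return np.mean(member_probs, axis=0)

# Three weak learners disagree on a 3-class problem.
probs = np.array([
    [0.6, 0.3, 0.1],   # member 1 favors class 0
    [0.2, 0.7, 0.1],   # member 2 favors class 1
    [0.5, 0.4, 0.1],   # member 3 favors class 0
])
avg = ensemble_predict(probs)
pred = int(np.argmax(avg))
```

Note that soft voting can differ from majority voting: here two of the three members favor class 0, yet the averaged probabilities favor class 1 because member 2 is much more confident than the others.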


(1) - Justifying Neural Network Ensembling

(2) - Ensemble strategies I - Distribution of data (TBI)

(3) - Ensemble strategies II - Outputs aggregation

   (3.1) - Aggregate outputs from individually trained models
   (3.2) - Ensemble awareness during training

(4) - Ensemble strategies III - Accelerate ensemble training (TBI)

   (4.1) - Snapshots - Train 1, get M for free
   (4.2) - Function preserving transformations (TBI)
    (4.2.1) - Net2Net (TBI)
    (4.2.2) - Network Morphism (TBI)
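Section 4.1 refers to snapshot ensembling ("Train 1, get M for free"): one training run uses a cyclic, cosine-annealed learning rate with warm restarts, and a model checkpoint is saved at the end of each cycle, where the learning rate is lowest. A hedged sketch of just the schedule (iteration counts and rates are illustrative, and the actual SGD step is elided):

```python
import math

def snapshot_lr(iteration, total_iters, cycles, lr_max):
    """Cosine-annealed learning rate with warm restarts every cycle."""
    iters_per_cycle = total_iters // cycles
    t = iteration % iters_per_cycle
    return lr_max / 2 * (math.cos(math.pi * t / iters_per_cycle) + 1)

total_iters, cycles, lr_max = 600, 3, 0.1
snapshots = []
for it in range(total_iters):
    lr = snapshot_lr(it, total_iters, cycles, lr_max)
    # ... one SGD step with `lr` would go here ...
    if (it + 1) % (total_iters // cycles) == 0:
        snapshots.append(it)  # save a model checkpoint at the cycle minimum
```

At inference time the M saved snapshots are aggregated like any other ensemble, at the cost of a single training run.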

Sections and papers they cover
Contextualize Deep Ensemble Learning
Why Ensembles? [1]
Belief aggregation [2]
Ensemble Awareness [3]
How to accelerate the training of ensembles?
Reduce training cost [4]
Mother Nets [5]
How to accelerate the inference?
Reduced number of models used [6]
Share parameters across models - TreeNets [4]
Distill knowledge - Model Compression [7] [8]
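The "Distill knowledge" entry refers to compressing an ensemble (or large teacher) into a single student model trained to match the teacher's temperature-softened output distribution. A minimal sketch of the distillation objective, assuming softmax outputs and a temperature T (the logits below are made up for demonstration):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T gives softer targets."""
    z = np.exp((logits - np.max(logits)) / T)
    return z / z.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL divergence between softened teacher and student distributions."""
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student predictions
    return float(np.sum(p * np.log(p / q)))

teacher = np.array([3.0, 1.0, 0.2])
good_student = np.array([2.9, 1.1, 0.3])   # close to the teacher
bad_student = np.array([0.1, 2.5, 1.0])    # far from the teacher
```

The loss is zero when the student reproduces the teacher exactly and grows as their softened distributions diverge; in practice it is usually combined with the ordinary cross-entropy on the true labels.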
Extra Reading
Function Preserving Transformations [9]
Network Morphism [10]