These words often appear in the same texts and tutorials, and some people seem to use them as synonyms.
They are not the same. They are not even similar. Sure, we see them used in the same context, but they describe two different steps of a single machine learning process.
Bootstrapping
Bootstrapping is a method of sample selection. The formal definition describes it as “random sampling with replacement.” Never mind the formal definition for now; let’s build some intuition around the term.
In short, it allows us to pick the same observation more than once while sampling (for example, when selecting observations to be used for training). It can be useful when we have a small dataset but the algorithm requires a lot of data. Don’t get too excited, though. It won’t magically let you use deep learning successfully when you have only 10 examples in the training set.
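Here is a minimal sketch of the idea; the toy array and the random seed are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed, for reproducibility only

# A toy dataset of 10 observations (made-up values, for illustration).
data = np.arange(10)

# Bootstrapping: draw as many samples as we have observations,
# *with replacement*, so some observations appear more than once
# and others do not appear at all.
bootstrap_sample = rng.choice(data, size=len(data), replace=True)
print(bootstrap_sample)  # duplicates are expected in the output
```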
Bagging
Now, we can move on to “bagging.” Bagging is a technique of fitting multiple classifiers and combining them into one ensemble model.
Each of the classifiers gets a different training set, and that is why the words “bootstrapping” and “bagging” are often used together: the training set for every classifier may be generated using bootstrapping.
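To make that concrete, here is a minimal hand-rolled sketch of bagging; the decision trees, the ensemble size, and the majority vote are illustrative assumptions, not the only choices:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# A synthetic binary classification problem (a stand-in for real data).
X, y = make_classification(n_samples=200, random_state=0)
rng = np.random.default_rng(0)

# Fit each classifier on its own bootstrap sample of the training data.
classifiers = []
for _ in range(10):
    idx = rng.choice(len(X), size=len(X), replace=True)  # bootstrapping
    classifiers.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# The ensemble prediction is a majority vote over the individual trees.
votes = np.stack([clf.predict(X) for clf in classifiers])
ensemble_prediction = (votes.mean(axis=0) > 0.5).astype(int)
```

Because every tree sees a slightly different sample, their errors are partially decorrelated, which is what the vote exploits.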
Bootstrapping and bagging
In Scikit-learn, the problem is nicely encapsulated (and not so nicely generalized). We have the sklearn.ensemble.BaggingClassifier classifier.
BaggingClassifier in its default configuration uses bootstrapping to choose samples for the training set of every classifier, but it can be configured to choose a subset of features randomly or to use random sampling without replacement.
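A short sketch of those configurations; the base estimator and the parameter values are illustrative choices:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)

# Default behavior: bootstrap=True, so every classifier is trained on a
# bootstrap sample (random sampling with replacement) of the training set.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=10,
                            random_state=0)
bagging.fit(X, y)

# Alternative configuration:
# - bootstrap=False: random sampling *without* replacement
#   (this variant is usually called "pasting"; max_samples sets the subset size),
# - bootstrap_features / max_features: randomly choose a subset of
#   features for every classifier.
pasting = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=10,
    bootstrap=False,
    max_samples=0.8,
    bootstrap_features=True,
    max_features=0.5,
    random_state=0,
)
pasting.fit(X, y)
```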