In previous posts, we talked about feature selection. We started by exploring univariate feature selection and then moved on to model-based approaches. Today we explore a different type of feature selection technique: sequential feature selection.
Sequential feature selection learns a subset of relevant features by sequentially adding (or removing) features according to the performance of the prediction model. Applying sequential feature selection is a bit like dressing for winter in Canada: when you step outside, you add all the layers of clothing you can; when you enter a building, you remove all the layers you can.
The most common types of sequential feature selection are forward feature selection and backward feature selection. Forward feature selection starts with zero features, evaluates all subsets with exactly one feature, and keeps the best-performing one; it then repeatedly adds the feature that most improves performance until a stopping criterion is reached (for example, a target number of features). Backward feature selection works in the opposite direction: it starts with all features and repeatedly removes the feature whose removal most improves (or least hurts) performance.
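As a minimal sketch of the idea, scikit-learn provides a `SequentialFeatureSelector` that supports both directions via its `direction` parameter. The dataset, estimator, and number of features below are illustrative choices, not taken from the notebook:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

# Toy dataset: 150 samples, 4 features.
X, y = load_iris(return_X_y=True)

# Any estimator with a fit method works; k-NN is an arbitrary choice here.
knn = KNeighborsClassifier(n_neighbors=3)

# Forward selection: start from zero features and greedily add the feature
# that yields the best cross-validated score, until 2 features are selected.
sfs = SequentialFeatureSelector(
    knn,
    n_features_to_select=2,
    direction="forward",  # use "backward" to start from all features instead
    cv=5,
)
sfs.fit(X, y)

# Boolean mask marking which of the 4 original features were kept.
print(sfs.get_support())

# Reduce the dataset to the selected features.
X_selected = sfs.transform(X)
print(X_selected.shape)
```

Swapping `direction="forward"` for `direction="backward"` runs the removal-based variant described above, at the cost of fitting the model on larger feature subsets early on.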
In the following notebook, you will find a practical application of sequential feature selection. You can access the data and the code on GitHub.