Feature Selection Techniques

Reference : https://www.youtube.com/watch?v=k-EpAMjw6AE

In this notebook, I have tried to cover most of the common feature selection techniques that were taught in the YouTube live session linked above as a reference.

What is feature selection?

      In machine learning and statistics, feature selection, also known as variable selection, attribute selection or variable subset selection, is the process of selecting a subset of relevant features (variables, predictors) for use in model construction. Feature selection techniques are used for several reasons:
  • simplification of models to make them easier for researchers/users to interpret,
  • shorter training times,
  • avoidance of the curse of dimensionality,
  • enhanced generalization by reducing overfitting (formally, reduction of variance).

List of the most common and widely used feature selection techniques

There are various feature selection techniques that can be used depending on the nature of the data. Some of them are as follows:

  • Univariate Selection
  • Constant Variance
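The two techniques above can be sketched with scikit-learn. This is a minimal illustration on a small synthetic array, not the mobile dataset itself: `SelectKBest` with the chi-squared score performs univariate selection (scoring each feature independently against the target), and `VarianceThreshold` implements the constant-variance filter (dropping features whose variance falls below a threshold).

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, VarianceThreshold, chi2

# Synthetic data: 6 samples, 4 non-negative features; the last column is constant.
X = np.array([
    [4, 1, 9, 7],
    [2, 8, 3, 7],
    [5, 1, 8, 7],
    [3, 9, 2, 7],
    [6, 2, 9, 7],
    [1, 8, 1, 7],
])
y = np.array([0, 1, 0, 1, 0, 1])

# Univariate selection: score each feature independently against the target
# (chi2 requires non-negative features) and keep the top k.
selector = SelectKBest(score_func=chi2, k=2)
X_best = selector.fit_transform(X, y)
print("Univariate (chi2) kept feature indices:", selector.get_support(indices=True))

# Constant-variance filter: with the default threshold=0.0, only perfectly
# constant columns (here, the last one) are removed.
vt = VarianceThreshold(threshold=0.0)
X_var = vt.fit_transform(X)
print("Variance filter kept feature indices:", vt.get_support(indices=True))
```

On the real dataset, the same calls apply to the feature matrix loaded from the CSV; for univariate selection, `k` controls how many of the top-scoring features are retained.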

Data Set:

We are using mobile dataset.csv, which is available in the GitHub repository https://github.com/s-4-m-a-n/Feature-Engineering-Live-sessions/commits?author=krishnaik06