Gradient Boosting Machines (GBMs) with XGBoost

This tutorial is a part of Machine Learning with Python: Zero to GBMs and Zero to Data Science Bootcamp by Jovian

The following topics are covered in this tutorial:

  • Downloading a real-world dataset from a Kaggle competition
  • Performing feature engineering and preparing the dataset for training
  • Training and interpreting a gradient boosting model using XGBoost
  • Training with K-fold cross-validation and ensembling results
  • Configuring the gradient boosting model and tuning hyperparameters

Let's begin by installing the required libraries.

# Restart the kernel after installation
!pip install numpy pandas-profiling matplotlib seaborn --quiet
!pip install jovian opendatasets xgboost graphviz lightgbm scikit-learn --upgrade --quiet