This post contains a video on Hyperparameter Tuning with XGBoost. The methods I cover are Grid Search, Randomized Search, and Bayesian Optimization. The discussion is kept at an easy-to-follow, high level, and I also include some links you may find useful if you're curious to learn more.


3 Methods for Hyperparameter Tuning with XGBoost – image by author
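
To give a feel for the first two methods before you dive into the video, here is a minimal sketch of Grid Search and Randomized Search applied to an XGBoost classifier, using scikit-learn's GridSearchCV and RandomizedSearchCV. The dataset, parameter ranges, and scoring metric are illustrative assumptions, not the exact setup from the notebook.

```python
# A minimal sketch of Grid Search and Randomized Search for XGBoost.
# The dataset and parameter ranges below are illustrative assumptions.
from scipy.stats import randint, uniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from xgboost import XGBClassifier

# Toy binary classification dataset standing in for real data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

model = XGBClassifier(eval_metric="logloss", random_state=0)

# Grid Search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(
    model,
    param_grid={
        "max_depth": [3, 5, 7],
        "learning_rate": [0.01, 0.1, 0.3],
        "n_estimators": [100, 300],
    },
    cv=5,
    scoring="roc_auc",
)
grid.fit(X, y)
print("Grid Search best params:", grid.best_params_)

# Randomized Search: samples a fixed number of candidates (n_iter)
# from the given distributions instead of trying every combination.
rand = RandomizedSearchCV(
    model,
    param_distributions={
        "max_depth": randint(3, 10),
        "learning_rate": uniform(0.01, 0.3),
        "n_estimators": randint(100, 500),
    },
    n_iter=20,
    cv=5,
    scoring="roc_auc",
    random_state=0,
)
rand.fit(X, y)
print("Randomized Search best params:", rand.best_params_)
```

Bayesian Optimization needs a separate library; the sketch below uses hyperopt (an assumption; the notebook may use a different package). Instead of sampling blindly, it builds a model of how past trials scored and uses it to propose promising hyperparameter settings.

```python
# A minimal sketch of Bayesian Optimization with the hyperopt library
# (an assumption; the notebook may use a different package). TPE uses
# past trial results to propose promising hyperparameter candidates.
import numpy as np
from hyperopt import Trials, fmin, hp, tpe
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

def objective(params):
    model = XGBClassifier(
        max_depth=int(params["max_depth"]),  # quniform returns floats
        learning_rate=params["learning_rate"],
        n_estimators=int(params["n_estimators"]),
        eval_metric="logloss",
        random_state=0,
    )
    score = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    return -score  # hyperopt minimizes, so negate the AUC

space = {
    "max_depth": hp.quniform("max_depth", 3, 10, 1),
    "learning_rate": hp.loguniform("learning_rate", np.log(0.01), np.log(0.3)),
    "n_estimators": hp.quniform("n_estimators", 100, 500, 50),
}

best = fmin(objective, space, algo=tpe.suggest, max_evals=20, trials=Trials())
print("Bayesian Optimization best params:", best)
```

Roughly speaking: Grid Search is exhaustive but scales poorly with the number of hyperparameters, Randomized Search fixes the evaluation budget up front, and Bayesian Optimization spends that budget more intelligently by learning from earlier trials, which pays off when each model fit is expensive.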

Video - "3 Methods for Hyperparameter Tuning with XGBoost"

Here is the video, available on YouTube:

Additional Links

The notebook presented in the video can be found here.

Click here to access the home page of the XGBoost project. 

The 2016 paper introducing XGBoost can be downloaded here.

You can also look at my blog articles, where I describe the mathematical details of the Gradient Boosting algorithm, as well as how the algorithm can be implemented in Python from scratch.
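
As a quick refresher on what those articles cover: gradient boosting builds the model stage-wise, at each step adding a tree fit to the pseudo-residuals, i.e. the negative gradient of the loss at the current prediction:

```latex
F_m(x) = F_{m-1}(x) + \nu \, h_m(x),
\qquad
r_{im} = -\left[\frac{\partial L\big(y_i, F(x_i)\big)}{\partial F(x_i)}\right]_{F = F_{m-1}}
```

Here \(h_m\) is the tree fit to the residuals \(r_{im}\), and \(\nu\) is the shrinkage factor, which corresponds to the learning_rate hyperparameter tuned in the searches above.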

Hi, I'm Michael Attard, a Data Scientist with a background in Astrophysics. I enjoy helping others on their journey to learn more about machine learning and how it can be applied in industry.
