From the course: NLP with Python for Machine Learning Essential Training


Introducing gradient boosting


- [Instructor] Over the next few lessons, we're going to go through a process similar to the one in the last few lessons, but with a new machine learning classifier. This one is called gradient boosting. It has some similarities to random forest, but some key differences as well. After we introduce gradient boosting in this lesson, over the next few lessons we'll implement it and explore different parameters, just like we did with random forest. Gradient boosting, like random forest, is an ensemble method. As a quick review, an ensemble method is a technique that creates multiple models and then combines them to produce better results than any of the single models individually. Gradient boosting is an ensemble method that takes an iterative approach to combining weak learners into a strong learner by focusing on the mistakes of prior iterations. In the last lesson, we learned about how random forest builds a certain number of fully-grown…
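The iterative "focus on the mistakes of prior iterations" idea can be sketched in a few lines of plain Python. This is not the course's implementation; it's a minimal illustration of gradient boosting with squared loss, using one-dimensional decision stumps as the weak learners, on made-up toy data. Each round fits a new stump to the current residuals (the mistakes so far) and adds a scaled copy of it to the running model.

```python
def fit_stump(x, residuals):
    """Find the threshold split on x that best fits the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm = sum(left) / len(left)    # mean residual on each side
        rm = sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def gradient_boost(x, y, n_rounds=20, learning_rate=0.3):
    """Iteratively add stumps, each fit to the current residuals."""
    base = sum(y) / len(y)            # start from the mean prediction
    stumps = []
    pred = [base] * len(y)
    for _ in range(n_rounds):
        # The residuals are the mistakes the current ensemble still makes.
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + learning_rate * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: base + learning_rate * sum(s(xi) for s in stumps)

# Hypothetical toy data: y is a step function of x.
x = [1, 2, 3, 4, 5, 6]
y = [0, 0, 0, 1, 1, 1]
model = gradient_boost(x, y)
```

After 20 rounds the ensemble's predictions sit very close to 0 on the left of the step and 1 on the right, because each new stump keeps correcting what the previous ones got wrong. In the upcoming lessons we'll use scikit-learn's implementation rather than hand-rolling it like this.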
