Emerging Topics Community: Return to Trees, Part 3: Random Forest
Description
Building on the discussion of individual decision trees in the prior episode, Shea and Anders turn to one of today's most popular ensemble models: the Random Forest. At first glance, the algorithm may look like a brute-force approach that simply runs hundreds or thousands of decision trees, but it leverages "bagging" (bootstrap aggregating) to reduce overfitting and to learn from the entire data set rather than just a few key features. We close by covering the model's strengths and weaknesses and providing some real-life examples.
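
To make the bagging idea concrete, here is a minimal sketch (not from the episode) of how a random forest combines many decision trees, assuming NumPy and scikit-learn are available; the toy data set and parameter choices are hypothetical:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(seed=0)
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    n_trees = 100
    trees = []
    for _ in range(n_trees):
        # Bootstrap sample: draw rows with replacement so each tree sees a
        # slightly different view of the training data.
        idx = rng.integers(0, len(X_train), size=len(X_train))
        # Limiting the features considered at each split ("sqrt") further
        # decorrelates the trees.
        tree = DecisionTreeClassifier(max_features="sqrt")
        tree.fit(X_train[idx], y_train[idx])
        trees.append(tree)

    # Bagging = bootstrap aggregating: the forest predicts by majority vote.
    votes = np.stack([t.predict(X_test) for t in trees])
    forest_pred = (votes.mean(axis=0) >= 0.5).astype(int)
    print("ensemble accuracy:", (forest_pred == y_test).mean())

In practice, scikit-learn's RandomForestClassifier wraps this same bootstrap-and-vote loop; the sketch above only spells out the mechanism discussed in the episode.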