Summary

Let’s summarize what we’ve covered.

Most importantly, we tackled a complex classification problem using decision trees and random forests. Our scenario was difficult: training a model to predict which athletes won medals in rhythmic gymnastics. Still, we pulled it off, and we did so using only basic features: age, weight, height, and the year of the games.
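
To make that workflow concrete, here is a minimal sketch of that kind of model in Python with scikit-learn. The file name gymnastics.csv, the column names, and the Medalist label are hypothetical stand-ins for illustration, not the exact names from the exercises.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical data loading; the real file and column names may differ.
dataset = pd.read_csv("gymnastics.csv")
features = ["Age", "Weight", "Height", "Year"]

# Hold out some data so we can check how the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(
    dataset[features], dataset["Medalist"], test_size=0.25, random_state=0
)

model = RandomForestClassifier(random_state=0)
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```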

We learned that optimizing complex models involves decisions about how the model is structured, such as how large or deep it will be. We discussed how larger, more complex models are much harder to understand internally once trained, but often deliver impressive performance gains over simpler model types.
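
In scikit-learn, for instance, those structural decisions map to constructor arguments such as n_estimators and max_depth. The values below are illustrative, not recommendations:

```python
from sklearn.ensemble import RandomForestClassifier

# A small forest of shallow trees: easier to inspect, but it may underfit.
small_model = RandomForestClassifier(n_estimators=10, max_depth=3, random_state=0)

# A larger, deeper forest: harder to understand internally once trained,
# but often better at capturing complex patterns.
large_model = RandomForestClassifier(n_estimators=500, max_depth=None, random_state=0)
```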

We also practiced working with hyperparameters: settings that change how training itself works. We found that well-chosen hyperparameters can substantially improve how well a model trains, and that finding the best combination requires both reasoning and experimentation.
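
One common way to combine that reasoning with experimentation is a cross-validated grid search. The sketch below uses scikit-learn's GridSearchCV on synthetic stand-in data; the candidate values are assumptions for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in data so the sketch runs on its own.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)

# Candidate values chosen by reasoning about the problem; the search
# supplies the experimentation by trying every combination with
# 5-fold cross-validation.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [3, 5, None],
}

search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)
print("Best hyperparameters:", search.best_params_)
```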