Lecture 8: Decision Trees
Today’s Topics
Decision trees are another powerful machine learning technique, and they have the advantage that it is easy to see what the algorithm has learned. Constructing an optimal decision tree for an arbitrary data set is yet another NP-hard problem, and many heuristic algorithms are available. In this lecture you will look at one of the simpler algorithms, ID3, which uses information theory to decide how to construct the tree.
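To give a flavour of how information theory enters the picture, here is a minimal Python sketch (not the lecture's own code) of the two quantities ID3 relies on: the Shannon entropy of a label distribution and the information gain of a candidate split. The function names and the toy data are illustrative assumptions.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(labels, groups):
    """Entropy reduction when the labels are split into the given groups."""
    total = len(labels)
    remainder = sum(len(g) / total * entropy(g) for g in groups)
    return entropy(labels) - remainder

# Toy example: splitting six labels on a binary attribute.
labels = ["yes", "yes", "no", "no", "yes", "no"]
split = [["yes", "yes", "no"], ["no", "yes", "no"]]
print(entropy(labels))             # 1.0 bit for a 50/50 distribution
print(information_gain(labels, split))
```

ID3 evaluates the information gain of every available attribute and splits on the one with the highest gain, then repeats the process on each branch.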
Slides and Notebooks
- Lecture slides
- The Python notebook used to prepare the lecture; you can view it here
Reading Guide
Information theory
You do not need to know much about information theory to apply the ID3 algorithm, but the first five chapters of An Introduction to Information Theory: Symbols, Signals & Noise by John R. Pierce will give some background to those who are interested. The book is available online at the University library.
Decision Trees
See the excellent notes by Richard Johansson from Chalmers.
What should I know by the end of this lecture?
- What are decision trees?
- What is Shannon entropy and how does it measure information?
- How do I calculate the Shannon entropy of a distribution?
- How does the ID3 learning algorithm work? (See the sketch after this list.)
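As a preview of the last two questions, here is a compact, hedged sketch of the ID3 recursion. It assumes categorical attributes and data given as a list of dicts with a "label" key; the names and the toy weather-style data set are illustrative, not taken from the lecture.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def id3(rows, attributes, label_key="label"):
    """Return a nested dict representing the tree, or a class label at a leaf."""
    labels = [row[label_key] for row in rows]
    # Stop if the node is pure or no attributes remain to split on.
    if len(set(labels)) == 1 or not attributes:
        return Counter(labels).most_common(1)[0][0]

    # The best attribute leaves the smallest weighted entropy after the split,
    # which is the same as having the highest information gain.
    def remainder(attr):
        return sum(
            len(subset) / len(rows) * entropy([r[label_key] for r in subset])
            for value in {row[attr] for row in rows}
            for subset in [[r for r in rows if r[attr] == value]]
        )

    best = min(attributes, key=remainder)
    tree = {best: {}}
    for value in {row[best] for row in rows}:
        subset = [r for r in rows if r[best] == value]
        tree[best][value] = id3(subset, [a for a in attributes if a != best], label_key)
    return tree

# Toy run on a tiny weather-style data set (illustrative only).
data = [
    {"outlook": "sunny", "windy": "no", "label": "play"},
    {"outlook": "sunny", "windy": "yes", "label": "stay"},
    {"outlook": "rainy", "windy": "no", "label": "stay"},
    {"outlook": "rainy", "windy": "yes", "label": "stay"},
]
print(id3(data, ["outlook", "windy"]))
```

The recursion mirrors the description in the lecture: pick the most informative attribute, split the data on its values, and repeat on each subset until the labels are pure or no attributes remain.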