Exercise 3.1.d

Truth table:




Entropy and Information gain per attribute:
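For reference, the calculations use the standard definitions from Chapter 3 of Mitchell's book (entropy of a boolean-labeled set S, and the information gain of an attribute A):

```latex
Entropy(S) = -p_{\oplus}\log_2 p_{\oplus} - p_{\ominus}\log_2 p_{\ominus}

Gain(S, A) = Entropy(S) - \sum_{v \in Values(A)} \frac{|S_v|}{|S|}\, Entropy(S_v)
```

Here p⊕ and p⊖ are the proportions of positive and negative examples in S, and S_v is the subset of S where attribute A has value v.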

Decision Tree (solution):


Exercise 3.1.c XOR

This is the well-known XOR truth table:


This is the summary; notice that the Entropy of the set is 1.


Notice that the Entropy of every attribute value is 1 and the Information Gain of every attribute is zero.
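This is easy to verify with a short script. Here is a minimal sketch (my own helper names, using only the standard library) that computes the set entropy and the gain of each attribute for XOR:

```python
import math

def entropy(labels):
    """Entropy of a list of 0/1 labels: -p*log2(p) - (1-p)*log2(1-p)."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    if p in (0.0, 1.0):
        return 0.0  # a pure set has zero entropy
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def info_gain(rows, labels, attr_index):
    """Information gain of splitting on the attribute in column attr_index."""
    g = entropy(labels)
    n = len(rows)
    for value in {row[attr_index] for row in rows}:
        subset = [lab for row, lab in zip(rows, labels) if row[attr_index] == value]
        g -= len(subset) / n * entropy(subset)
    return g

# XOR truth table: inputs (A, B), label A XOR B
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [a ^ b for a, b in rows]

print(entropy(labels))             # 1.0 -- entropy of the whole set
print(info_gain(rows, labels, 0))  # 0.0 -- gain of A
print(info_gain(rows, labels, 1))  # 0.0 -- gain of B
```

Both splits leave two subsets that are each half positive (entropy 1), so the weighted average cancels the set entropy exactly and the gain is zero: ID3's greedy criterion cannot distinguish A from B here.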


This is Decision Tree (solution):


Exercise 3.1.b

This is the truth table:


This is a summary obtained from the truth table; the entropy is for the whole set:

From the truth table, we calculate the Entropy and Gain of every attribute:


ID3 uses the Information Gain to decide which attribute to use as the next node in the tree; in this case we select A as the root node since it has the highest Information Gain.
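The selection step can be sketched in Python. I'm assuming the boolean function of exercise 3.1.b is A ∨ (B ∧ C), as stated in my copy of the book; running the sketch confirms that A has the highest gain:

```python
import math
from itertools import product

def entropy(labels):
    """Entropy of a list of 0/1 labels."""
    n = len(labels)
    p = sum(labels) / n if n else 0.0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gain(rows, labels, i):
    """Information gain of splitting on boolean attribute column i."""
    g = entropy(labels)
    for v in (0, 1):
        sub = [lab for r, lab in zip(rows, labels) if r[i] == v]
        if sub:
            g -= len(sub) / len(rows) * entropy(sub)
    return g

# Full truth table for A OR (B AND C) -- assumed to be the exercise's function.
rows = list(product((0, 1), repeat=3))
labels = [a or (b and c) for a, b, c in rows]

gains = {name: gain(rows, labels, i) for i, name in enumerate("ABC")}
best = max(gains, key=gains.get)
print(best)  # prints "A": ID3 picks the attribute with the highest gain
```

B and C get identical (and much smaller) gains by symmetry, so A becomes the root, matching the tree below.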

This is the decision tree (solution):

Solutions to exercises found in Machine Learning by Tom M. Mitchell

I’m taking my 3rd class in the OMSCS program at Georgia Tech, which is Machine Learning, taught by Prof. Charles Isbell and Prof. Michael Littman (I previously took Computer Vision with Prof. Aaron Bobick and Knowledge-Based AI with Prof. David Joyner).

The book we are using is Machine Learning by Tom M. Mitchell. At the end of every chapter there is a set of exercises; as I worked through them I often found myself wanting to corroborate my solutions, but I couldn’t find them published anywhere, so I decided to document them on my blog in the hope of helping others like me.

If you find a mistake or a better way to do this, please let me know as I would be more than happy to learn from you!

I will do my best to be diligent and consistent at blogging, but this is going to be a low-priority task (more like the ‘idle’ task in my scheduler).


Links to solutions:

Exercise 3.1.a

Exercise 3.1.b

Exercise 3.1.c

Exercise 3.1.d

Exercise 3.2