Machine Learning
Exercises Chapter 8
March 16, 2009
1. Instance-based classification requires a distance measure. When data can be represented as n-dimensional vectors with numeric or symbolic components, a generally usable distance function is

d(x, y) = \sqrt{\sum_i w_i \cdot d'(x_i, y_i)^2},

where for symbolic values d' is defined by d'(x, y) = 0 ⇔ x = y, otherwise 1; and for numerical values d'(x, y) = |x − y|. Note that for purely numerical vectors, with w_i = 1 we obtain the Euclidean distance, whereas for purely symbolic vectors the same weights give us the Hamming distance.
In this exercise the instance space is {true, false} × R. Unless specified otherwise, use the function above as the distance criterion. Consider the following points:
Example  A      B  Class
1        true   1  +
2        false  3  +
3        true   5  +
4        false  2  −
a) Classify (false, 4) with Nearest Neighbor.
b) Classify (false, 4) with 3NN.
c) Suggest a prototype for all positive examples and a prototype for all negative examples. Why do you make these suggestions?
d) Classify (false, 4) using the prototypes.
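To make the distance criterion concrete, the following minimal Python sketch implements the mixed symbolic/numeric distance with all weights w_i = 1, together with a simple k-NN majority vote over the four training examples. The function names (component_dist, dist, knn_classify) are our own, and ties in the vote are broken arbitrarily; this is an illustration of the mechanics, not a prescribed implementation.

```python
import math

# d'(x, y) = 0 if x == y else 1 for symbolic values, |x - y| for numeric ones.
def component_dist(x, y):
    if isinstance(x, bool):
        return 0 if x == y else 1
    return abs(x - y)

# d(x, y) = sqrt(sum_i w_i * d'(x_i, y_i)^2), with w_i = 1 by default.
def dist(x, y, weights=None):
    if weights is None:
        weights = [1] * len(x)
    return math.sqrt(sum(w * component_dist(a, b) ** 2
                         for w, a, b in zip(weights, x, y)))

# The four training examples from the table above.
examples = [((True, 1), '+'), ((False, 3), '+'),
            ((True, 5), '+'), ((False, 2), '-')]

def knn_classify(query, k):
    # Sort the examples by distance to the query and take a
    # majority vote among the labels of the k nearest ones.
    ranked = sorted(examples, key=lambda e: dist(query, e[0]))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)
```

For instance, knn_classify((False, 4), 1) computes the answer to part a), and knn_classify((False, 4), 3) the answer to part b).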
2. Sketch a Voronoi map for the 1NN method for the following examples:
a)
Example x y Class
1 0 0 −
2 0 −3 −
3 −2 −2 −
4 0 5 +
5 2 3 +
6 −2 2 +
7 −5 0 +
b)
Example x y Class
1 0 0 −
2 2 0 −
3 −2 0 −
4 1 2 +
5 −1 2 +
6 −4 2 +
7 4 2 +
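One way to check a sketched Voronoi map is to classify a coarse grid of points with 1NN: the border between the '+' and '−' regions approximates the decision boundary induced by the Voronoi cells. Below is a small Python sketch for point set a); the dictionary and helper name (nn_label) are our own.

```python
import math

# Training points of exercise 2a): (x, y) -> class label.
points_a = {(0, 0): '-', (0, -3): '-', (-2, -2): '-',
            (0, 5): '+', (2, 3): '+', (-2, 2): '+', (-5, 0): '+'}

def nn_label(q, points):
    # 1NN: return the label of the Euclidean-nearest training point.
    return min(points.items(), key=lambda kv: math.dist(q, kv[0]))[1]

# Print the 1NN class of every grid point, one row of the plane per line;
# the character pattern traces the decision boundary.
for y in range(5, -5, -1):
    print(''.join(nn_label((x, y), points_a) for x in range(-6, 7)))
```

The same loop with the table from b) substituted for points_a checks the second map.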