On Mar 13, 2007, at 23:27, Peter Drake wrote:
Hmmm -- p. 735 of Russell & Norvig's AI text contains a strong argument
that "nearest-neighbor methods cannot be trusted for high-dimensional
data".
AFAIK that's because when you add more and more dimensions and try to
calculate the distance between points, every dimension contributes to the
sum, and the nearest and farthest neighbors end up at almost the same
distance, so "nearest" stops telling you much.
Given your original description I'm not so sure that what you're doing should
actually be called a nearest-neighbor method; it may be more like a decision
tree... That said, the curse of dimensionality is a general problem which
can come up with all similarity/distance-based approaches trained on finite
data.
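
For what it's worth, the effect is easy to see with a few lines of NumPy.
This is just a rough sketch (the dimensions, sample size, and uniform
sampling are arbitrary choices of mine, not anything from the book): draw
random points in the unit cube and compare the nearest and farthest
distances from a random query point.

import numpy as np

rng = np.random.default_rng(0)

for d in (2, 10, 100, 1000):
    points = rng.random((10000, d))   # 10,000 random points in [0,1]^d
    query = rng.random(d)             # one random query point
    dists = np.linalg.norm(points - query, axis=1)
    print("d=%4d  nearest=%8.3f  farthest=%8.3f  ratio=%5.2f"
          % (d, dists.min(), dists.max(), dists.max() / dists.min()))

As d grows, the farthest/nearest ratio shrinks toward 1, i.e. the "nearest"
neighbor is barely closer than anything else, which is exactly why
distance-based methods lose their footing in high dimensions.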
Hmmm -- p. 735 of Russell & Norvig's AI text contains a strong
argument that "nearest-neighbor methods cannot be trusted for
high-dimensional data".
Peter Drake
http://www.lclark.edu/~drake/
On Mar 7, 2007, at 9:38 PM, Peter Drake wrote:
First, a general hypothesis on heuristics: one should ...