A New Approximation Method of the Quadratic Discriminant Function
Shin'ichiro Omachi, Fang Sun, and Hirotomo Aso
Lecture Notes in Computer Science, vol.1876
(Joint IAPR International Workshops SSPR 2000 and SPR 2000),
pp.601-610, August/September 2000
Abstract
For many statistical pattern recognition methods,
distributions of sample vectors are assumed to be normal,
and the quadratic discriminant function derived from
the probability density function of the multivariate normal distribution
is used for classification.
However, the computational cost is $O(n^2)$ for $n$-dimensional vectors.
Moreover, if there are not enough training sample patterns,
the covariance matrix cannot be estimated accurately.
When the dimensionality is large, these disadvantages
markedly reduce classification performance.
To avoid these problems,
this paper proposes a new approximation method of the quadratic
discriminant function.
The approximation is obtained by replacing the small eigenvalues
of the covariance matrix with a constant estimated by maximum likelihood.
This approximation not only reduces the computational cost but also improves
the classification accuracy.
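As a rough illustration of the idea described above, the following sketch keeps only the k dominant eigenpairs of the covariance matrix and replaces the remaining eigenvalues with their average, which is the maximum-likelihood estimate of a shared residual variance under an isotropic-tail model. This is an assumption-laden sketch, not the paper's exact formulation; the function name and the choice of constant are illustrative.

```python
import numpy as np

def approx_qdf(x, mean, cov, k):
    """Sketch: quadratic discriminant value for x, with the small
    eigenvalues of cov replaced by a constant (hypothetical helper,
    not the paper's exact method)."""
    n = mean.shape[0]
    vals, vecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]     # reorder to descending
    sigma2 = vals[k:].mean()                   # constant for the small eigenvalues
    d = x - mean
    proj = vecs[:, :k].T @ d                   # projections onto top-k eigenvectors
    # Mahalanobis term: exact in the top-k subspace, isotropic in the rest
    maha = np.sum(proj**2 / vals[:k]) + (d @ d - np.sum(proj**2)) / sigma2
    # log-determinant term with the replaced eigenvalues
    logdet = np.sum(np.log(vals[:k])) + (n - k) * np.log(sigma2)
    return maha + logdet
```

Because only k eigenvectors are stored and projected onto, each evaluation costs O(nk) rather than the O(n^2) of the full quadratic discriminant function, which is where the computational saving comes from.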