zbMATH — the first resource for mathematics

An efficient discriminant-based solution for small sample size problem. (English) Zbl 1178.68489
Summary: Classification of high-dimensional statistical data is usually not amenable to standard pattern recognition techniques because of an underlying small sample size problem. To address the problem of high-dimensional data classification in the face of a limited number of samples, a novel Principal Component Analysis (PCA) based feature extraction/classification scheme is proposed. The proposed method yields a piecewise linear feature subspace and is particularly well-suited to difficult recognition problems where achievable classification rates are intrinsically low. Such problems are often encountered in cases where classes are highly overlapped, or in cases where a prominent curvature in data renders a projection onto a single linear subspace inadequate. The proposed feature extraction/classification method uses class-dependent PCA in conjunction with linear discriminant feature extraction and performs well on a variety of real-world datasets, ranging from digit recognition to classification of high-dimensional bioinformatics and brain imaging data.
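The pipeline the summary describes (a separate PCA subspace per class, followed by a discriminant/classification stage) can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the function names are hypothetical, and the final step is simplified to a nearest-subspace rule on per-class reconstruction residuals rather than the paper's linear discriminant feature extraction.

```python
# Hedged sketch of class-dependent PCA for small-sample classification.
# One PCA model is fitted per class; a sample is then scored by how well
# each class subspace reconstructs it. All names are illustrative.
import numpy as np

def fit_class_pca(X, y, n_components):
    """Fit a separate PCA (class mean + top principal directions) per class."""
    models = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        # SVD of the centred class data gives the principal directions
        _, _, Vt = np.linalg.svd(Xc - mu, full_matrices=False)
        models[c] = (mu, Vt[:n_components])
    return models

def residual_features(X, models):
    """Reconstruction error of every sample w.r.t. every class subspace;
    the class that fits a sample well yields a small residual."""
    cols = []
    for c in sorted(models):
        mu, V = models[c]
        Z = (X - mu) @ V.T            # project onto the class subspace
        R = X - (mu + Z @ V)          # reconstruction residual
        cols.append(np.linalg.norm(R, axis=1))
    return np.column_stack(cols)

def predict(X, models):
    """Nearest-subspace rule: assign the class with the smallest residual."""
    F = residual_features(X, models)
    classes = sorted(models)
    return np.array([classes[i] for i in np.argmin(F, axis=1)])

# Toy sanity check on synthetic two-class, 20-dimensional data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 20)),
               rng.normal(3.0, 1.0, (100, 20))])
y = np.repeat([0, 1], 100)

models = fit_class_pca(X, y, n_components=5)
F = residual_features(X, models)
acc = (predict(X, models) == y).mean()
```

On well-separated synthetic classes the nearest-subspace rule recovers the labels; the method summarized above goes further by feeding such class-wise projections into a linear discriminant feature extraction stage, yielding a piecewise linear feature subspace.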
MSC:
68T10 Pattern recognition, speech recognition
References:
[1]Duda, R. O.; Hart, P. E.; Stork, D. G.: Pattern classification, (2001) · Zbl 0968.68140
[2]Fisher, R. A.: The use of multiple measurements in taxonomic problems, Ann. eugenics 7, 179-188 (1936)
[3]Rao, C. R.: The utilization of multiple measurements in problems of biological classification, J. R. Stat. soc. B methods 10, No. 2, 159-203 (1948) · Zbl 0034.07902
[4]Lewis, P. M., II: The characteristic selection problem in recognition systems, IEEE trans. Inf. theory 8, No. 2, 171-178 (1962) · Zbl 0099.34505
[5]H.P. Decell, J.A. Quirein, An iterative approach to the feature selection problem, in: Proceedings of Purdue Conference on Machine Processing of Remotely Sensed Data, 1972, pp. 3B1 – 3B12.
[6]Devijver, P. A.; Kittler, J.: Pattern recognition: A statistical approach, (1982) · Zbl 0542.68071
[7]Kumar, N.; Andreou, A. G.: Heteroscedastic discriminant analysis and reduced rank HMMs for improved speech recognition, Speech commun. 26, No. 4, 283-297 (1998)
[8]G. Saon, M. Padmanabhan, Minimum Bayes error feature selection for continuous speech recognition, in: Advances in Neural Information Processing Systems, vol. 13, MIT Press, Cambridge, MA, 2001, pp. 800 – 806.
[9]Torkkola, K.: Discriminative features for document classification, Proceedings 16th international conference on pattern recognition 1, 472-475 (2002)
[10]Loog, M.; Duin, R. P. W.: Linear dimensionality reduction via a heteroscedastic extension of LDA: the Chernoff criterion, IEEE trans. Pattern anal. 26, 732-739 (2004)
[11]Nenadic, Z.: Information discriminant analysis: feature extraction with an information theoretic objective, IEEE trans. Pattern anal. 29, No. 8, 1394-1407 (2007)
[12]Das, K.; Nenadic, Z.: Approximate information discriminant analysis: a computationally simple heteroscedastic feature extraction technique, Pattern recognition 41, No. 5, 1565-1574 (2008) · Zbl 1140.68460 · doi:10.1016/j.patcog.2007.10.001
[13]Fukunaga, K.: Introduction to statistical pattern recognition, (1990) · Zbl 0711.62052
[14]Turk, M.; Pentland, A.: Eigenfaces for recognition, J. cognitive neurosci. 3, No. 1, 71-86 (winter 1991)
[15]Belhumeur, P. N.; Hespanha, J. P.; Kriegman, D. J.: Eigenfaces vs. Fisherfaces: recognition using class specific linear projection, IEEE trans. Pattern anal. 19, No. 7, 711-720 (1997)
[16]Pontil, M.; Verri, A.: Support vector machines for 3d object recognition, IEEE trans. Pattern anal. 20, No. 6, 637-646 (1998)
[17]Golub, T. R.; Slonim, D. K.; Tamayo, P.; Huard, C.; Gaasenbeek, M.; Mesirov, J. P.; Coller, H.; Loh, M. L.; Downing, J. R.; Caligiuri, M. A.; Bloomfield, C. D.; Lander, E. S.: Molecular classification of cancer: class discovery and class prediction by gene expression monitoring, Science 286, No. 5439, 531-537 (1999)
[18]Tusher, V. G.; Tibshirani, R.; Chu, G.: Significance analysis of microarrays applied to the ionizing radiation response, Proc. natl. Acad. sci. USA 98, No. 9, 5116-5121 (2001) · Zbl 1012.92014 · doi:10.1073/pnas.091062498
[19]Rizzuto, D. S.; Mamelak, A. N.; Sutherling, W. W.; Fineman, I.; Andersen, R. A.: Spatial selectivity in human ventrolateral prefrontal cortex, Nat. neurosci. 8, 415-417 (2005)
[20]Nenadic, Z.; Rizzuto, D. S.; Andersen, R. A.; Burdick, J. W.: Advances in cognitive neural prosthesis: recognition of neural data with an information-theoretic objective, Toward brain computer interfacing, 175-190 (2007)
[21]Huber, P. J.: Projection pursuit, Ann. stat. 13, No. 2, 435-475 (1985)
[22]Schäfer, J.; Strimmer, K.: A shrinkage approach to large-scale covariance matrix estimation and implications for functional genomics, Stat. appl. Genet. mol. Biol. 4, No. 1 (2005)
[23]Daniels, M. J.; Kass, R. E.: Shrinkage estimators for covariance matrices, Biometrics 57, 1173-1184 (2001) · Zbl 1209.62132 · doi:10.1111/j.0006-341X.2001.01173.x
[24]Hastie, T.; Tibshirani, R.; Friedman, J.: The elements of statistical learning, (2001)
[25]E.P. Xing, M.I. Jordan, R.M. Karp, Feature selection for high-dimensional genomic microarray data, in: ICML ’01: Proceedings of the Eighteenth International Conference on Machine Learning, San Francisco, CA, USA, Morgan Kaufmann Publishers Inc., 2001, pp. 601 – 608.
[26]C.E. Thomaz, D.F. Gillies, R.Q. Feitosa, Small sample problem in Bayes plug-in classifier for image recognition, in: International Conference on Image and Vision Computing New Zealand, 2001, pp. 295 – 300.
[27]Hoffbeck, J. P.; Landgrebe, D. A.: Covariance matrix estimation and classification with limited training data, IEEE trans. Pattern anal. 18, No. 7, 763-767 (1996)
[28]Jolliffe, I. T.: Principal component analysis, (1986)
[29]Chen, L. -F.; Liao, H. -Y.M.; Ko, M. -T.; Lin, J. -C.; Yu, G. -J.: A new LDA-based face recognition system which can solve the small sample size problem, Pattern recognition 33, No. 10, 1713-1726 (2000)
[30]Yu, H.; Yang, J.: A direct LDA algorithm for high-dimensional data — with application to face recognition, Pattern recognition 34, No. 10, 2067-2070 (2001) · Zbl 0993.68091 · doi:10.1016/S0031-3203(00)00162-X
[31]Wang, X.; Tang, X.: Dual-space linear discriminant analysis for face recognition, 2004 IEEE computer society conference on computer vision and pattern recognition (CVPR’04) 02, 564-569 (2004)
[32]Zhu, M.; Martinez, A. M.: Subclass discriminant analysis, IEEE trans. Pattern anal. 28, No. 8, 1274-1286 (2006)
[33]Kirby, M.; Sirovich, L.: Application of the Karhunen – Loève procedure for the characterization of human faces, IEEE trans. Pattern anal. 12, 103-108 (1990)
[34]R. Huang, Q. Liu, H. Lu, S. Ma, Solving the small sample size problem of LDA, in: ICPR ’02: Proceedings of the 16th International Conference on Pattern Recognition (ICPR’02), vol. 3, 2002, pp. 29 – 32.
[35]Zhu, M.; Martinez, A. M.: Selecting principal components in a two-stage LDA algorithm, 132-137 (2006)
[36]Zhang, S.; Sim, T.: Discriminant subspace analysis: a Fukunaga – Koontz approach, IEEE trans. Pattern anal. 29, No. 10, 1732-1745 (2007)
[37]Gao, H.; Davis, J. W.: Why direct LDA is not equivalent to LDA, Pattern recognition 39, No. 5, 1002-1006 (2006) · Zbl 1158.68467 · doi:10.1016/j.patcog.2005.11.016
[38]Fukunaga, K.; Koontz, W. L. G.: Application of the Karhunen – Loève expansion to feature selection and ordering, IEEE trans. Comput. 19, No. 4, 311-318 (1970) · Zbl 0197.14604 · doi:10.1109/T-C.1970.222918
[39]Schölkopf, B.; Smola, A.; Müller, K. -R.: Nonlinear component analysis as a kernel eigenvalue problem, Neural comput. 10, No. 5, 1299-1319 (1998)
[40]Baudat, G.; Anouar, F.: Generalized discriminant analysis using a kernel approach, Neural comput. 12, No. 10, 2385-2404 (2000)
[41]S. Mika, G. Ratsch, J. Weston, B. Scholkopf, K.-R. Müller, Fisher discriminant analysis with kernels, in: Proceedings of IEEE Neural Networks for Signal Processing Workshop, 1999, pp. 41 – 48.
[42]Mika, S.; Weston, J.; Smola, A.: Invariant feature extraction and classification in kernel spaces, Advances in neural information processing systems 12, 526-532 (2000)
[43]Mika, S.; Rätsch, G.; Weston, J.; Schölkopf, B.; Smola, A.; Müller, K. -R.: Constructing descriptive and discriminative nonlinear features: Rayleigh coefficients in kernel feature spaces, IEEE trans. Pattern anal. 25, No. 5, 623-633 (2003)
[44]Yang, M.: Kernel eigenfaces vs. kernel Fisherfaces: face recognition using kernel methods, 215 (2002)
[45]Yang, J.; Frangi, A. F.; Yang, J. -Y.; Jin, Z.: KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition, IEEE trans. Pattern anal. 27, No. 2, 230-244 (2005)
[46]Roweis, S. T.; Saul, L. K.: Nonlinear dimensionality reduction by locally linear embedding, Science 290, 2323-2326 (2000)
[47]Tenenbaum, J. B.; De Silva, V.; Langford, J. C.: A global geometric framework for nonlinear dimensionality reduction, Science 290, 2319-2323 (2000)
[48]He, X.; Yan, S.; Hu, Y.; Niyogi, P.; Zhang, H. -J.: Face recognition using Laplacianfaces, IEEE trans. Pattern anal. 27, No. 3, 328-340 (2005)
[49]Y. Fu, T.S. Huang, Locally linear embedded eigenspace analysis, U.I.U.C. Technical Report IFP-TR, ECE, 2005.
[50]H.-T. Chen, H.-W. Chang, T.-L. Liu, Local discriminant embedding and its variants, in: Proceedings of Computer Vision and Pattern Recognition, CVPR, vol. 2, June 2005, pp. 846 – 853.
[51]Yan, S.; Xu, D.; Zhang, B.; Zhang, H. -J.; Yang, Q.; Lin, S.: Graph embedding and extensions: a general framework for dimensionality reduction, IEEE trans. Pattern anal. 29, No. 1, 40-51 (2007)
[52]Fu, Y.; Yan, S.; Huang, T. S.: Classification and feature extraction by simplexization, IEEE trans. Inf. forensics secur. 3, No. 1, 91-100 (2008)
[53]Fu, Y.; Huang, T. S.: Image classification using correlation tensor analysis, IEEE trans. Image process. 17, No. 2, 226-234 (2008)
[54]Xiaofei, H.; Niyogi, P.: Locality preserving projections, Advances in neural information processing systems (NIPS) 18 (2003)
[55]Vasilescu, M. A. O.; Terzopoulos, D.: Multilinear subspace analysis of image ensembles, Proceedings of the IEEE computer society conference on computer vision and pattern recognition 2 (2003)
[56]Ye, J.; Janardan, R.; Li, Q.: Two-dimensional linear discriminant analysis, Advances in neural information processing systems 17, 1569-1576 (2005)
[57]He, X.; Cai, D.; Niyogi, P.: Tensor subspace analysis, Advances in neural information processing systems (NIPS) 18, 499-506 (2006)
[58]K. Das, J. Meyer, Z. Nenadic, Analysis of large-scale brain data for brain – computer interfaces, in: Proceedings of the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2006, pp. 5731 – 5734.
[59]K. Das, S. Osechinskiy, Z. Nenadic, A classwise PCA-based recognition of neural data for brain-computer interfaces, in: Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2007, pp. 6519 – 6522.
[60]Meyer, F. G.; Shen, X.: Classification of fMRI time series in a low-dimensional subspace with a spatial prior, IEEE trans. Med. imaging 27, No. 1, 87-98 (2008)
[61]Wolpaw, J. R.; Mcfarland, D. J.: Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans, Proc. natl. Acad. sci. USA 101, No. 51, 17849-17854 (2004)
[62]Pfurtscheller, G.; Neuper, C.; Flotzinger, D.; Pregenzer, M.: EEG-based discrimination between imagination of right and left hand movement, Electroen. clin. Neuro. 103, No. 6, 642-651 (1997)
[63]S. Shan, W. Gao, X. Chen, J. Ma, Novel face recognition based on individual eigen-subspaces, in: Signal Processing Proceedings, 2000, 5th International Conference on WCCC-ICSP 2000, vol. 3, 21 – 25 August 2000, pp. 1522 – 1525.
[64]Shan, S. G.; Gao, W.; Zhao, D.: Face recognition based on face-specific subspace, Int. J. Imaging syst. Technol. 13, No. 1, 23-32 (2003)
[65]Liu, Z. -Y.; Chiu, K. -C.; Xu, L.: Improved system for object detection and star/galaxy classification via local subspace analysis, Neural networks 16, No. 3 – 4, 437-451 (2003)
[66]Min, W.; Lu, K.; He, X.: Locality pursuit embedding, Pattern recognition 37, No. 4, 781-788 (2004)
[67]Mallat, S.: A wavelet tour of signal processing, (1999) · Zbl 0998.94510
[68]Donoho, D. L.: De-noising by soft thresholding, IEEE trans. Inf. theory 41, No. 3, 613-627 (1995) · Zbl 0820.62002 · doi:10.1109/18.382009
[69]Parzen, E.: On the estimation of a probability density function and mode, Ann. math. Stat. 33, 1065-1076 (1962) · Zbl 0116.11302 · doi:10.1214/aoms/1177704472
[70]Tubbs, J. D.; Coberly, W. A.; Young, D. M.: Linear dimension reduction and Bayes classification with unknown population parameters, Pattern recognition 15, No. 3, 167-172 (1982) · Zbl 0491.62047 · doi:10.1016/0031-3203(82)90068-1
[71]Brunzell, H.; Eriksson, J.: Feature reduction for classification of multidimensional data, Pattern recognition 33, No. 10, 1741-1748 (2000)
[72]S. Hettich, C.L. Blake, C.J. Merz, UCI repository of machine learning databases, http://www.ics.uci.edu/~mlearn/MLRepository.html, Department of Information and Computer Sciences, University of California, Irvine, 1998.
[73]Kittler, J.; Hatef, M.; Duin, R. P. W.; Matas, J.: On combining classifiers, IEEE trans. Pattern anal. 20, No. 3, 226-239 (1998)
[74]Michalski, R. S.: Learning by being told and learning from examples: an experimental comparison of the two methods of knowledge acquisition in the context of developing an expert system for soybean disease diagnosis, Int. J. Pol anal. Inf. syst. 4, No. 2, 125-161 (1980)
[75]Alon, U.; Barkai, N.; Notterman, D. A.; Gish, K.; Ybarra, S.; Mack, D.; Levine, A. J.: Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays, Proc. natl. Acad. sci. USA 96, No. 12, 6745-6750 (1999)
[76]Petricoin, E. F.; Ardekani, A. M.; Hitt, B. A.; Levine, P. J.; Fusaro, V. A.; Steinberg, S. M.; Mills, G. B.; Simone, C.; Fishman, D. A.; Kohn, E. C.; Liotta, L. A.: Use of proteomic patterns in serum to identify ovarian cancer, Lancet 359, No. 9306, 572-577 (2002)
[77]Torkkola, K.: Feature extraction by non-parametric mutual information maximization, J. Mach learn. Res. 3, 1415-1438 (2003) · Zbl 1102.68638 · doi:10.1162/153244303322753742
[78]R. Kohavi, A study of cross-validation and bootstrap for accuracy estimation and model selection, in: International Joint Conference on Artificial Intelligence, 1995, pp. 1137 – 1145.