## Soft and hard classification by reproducing kernel Hilbert space methods (English) · Zbl 1106.62338

Summary: Reproducing kernel Hilbert space (RKHS) methods provide a unified context for solving a wide variety of statistical modelling and function estimation problems. We consider two such problems: We are given a training set $$\{(y_i,t_i),\ i=1,\dots,n\}$$, where $$y_i$$ is the response for the $$i$$th subject, and $$t_i$$ is a vector of attributes for this subject. The value of $$y_i$$ is a label that indicates which category it came from. For the first problem, we wish to build a model from the training set that assigns to each $$t$$ in an attribute domain of interest an estimate of the probability $$p_j(t)$$ that a (future) subject with attribute vector $$t$$ is in category $$j$$. The second problem is in some sense less ambitious; it is to build a model that assigns to each $$t$$ a label, which classifies a future subject with that $$t$$ into one of the categories or possibly “none of the above”. The approach to the first of these two problems discussed here is a special case of what is known as penalized likelihood estimation. The approach to the second problem is known as the support vector machine. We also note some alternate but closely related approaches to the second problem. These approaches are all obtained as solutions to optimization problems in RKHS. Many other problems, in particular the solution of ill-posed inverse problems, can be obtained as solutions to optimization problems in RKHS and are mentioned in passing. We caution the reader that although a large literature exists in all of these topics, in this inaugural article we are selectively highlighting work of the author, former students, and other collaborators.
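The soft/hard distinction described above can be sketched in a few lines of code. The example below is not from the paper itself; it is a minimal pure-Python illustration of the first (penalized likelihood) approach for two categories with a Gaussian kernel, using the fact that the RKHS minimizer is a finite kernel expansion (the representer theorem). The soft answer is a probability estimate; thresholding it at 1/2 gives a hard label. The support vector machine would instead replace the log-likelihood term with the hinge loss. All function names, the toy data, and the tuning constants are illustrative choices, not the author's.

```python
import math

def rbf(s, t, gamma=1.0):
    """Gaussian (radial basis function) reproducing kernel on R^d."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(s, t)))

def fit_penalized_logistic(X, y, lam=0.1, lr=0.1, steps=2000):
    """Soft classification by penalized likelihood in an RKHS (a sketch).

    By the representer theorem, the minimizer of
        sum_i log(1 + exp(-y_i f(t_i))) + lam * ||f||_RKHS^2
    has the form f(t) = sum_j alpha_j K(t, t_j), so we run plain
    gradient descent on alpha. Labels y_i are in {-1, +1}.
    """
    n = len(X)
    K = [[rbf(X[i], X[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(steps):
        f = [sum(K[i][j] * alpha[j] for j in range(n)) for i in range(n)]
        # derivative of the negative log-likelihood with respect to f_i
        g = [-y[i] / (1.0 + math.exp(y[i] * f[i])) for i in range(n)]
        # chain rule through f = K @ alpha, plus the penalty gradient 2*lam*K*alpha
        grad = [sum(K[j][i] * g[i] for i in range(n)) + 2.0 * lam * f[j]
                for j in range(n)]
        alpha = [a - lr * d for a, d in zip(alpha, grad)]
    return alpha

def predict_proba(alpha, X_train, t):
    """Soft answer: estimated probability that a future subject with
    attribute vector t belongs to the +1 category."""
    f = sum(a * rbf(x, t) for a, x in zip(alpha, X_train))
    return 1.0 / (1.0 + math.exp(-f))

# Toy training set: the category is determined by the first attribute.
X = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
y = [-1, -1, +1, +1]
alpha = fit_penalized_logistic(X, y)
probs = [predict_proba(alpha, X, t) for t in X]
# Hard answer: threshold the probability estimate at 1/2.
labels = [+1 if p > 0.5 else -1 for p in probs]
```

Note that the optimization happens entirely over the coefficient vector `alpha`, never over the (infinite-dimensional) RKHS itself; that reduction is exactly what makes these RKHS formulations computable.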

### MSC:

- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 46N30 Applications of functional analysis in probability theory and statistics

### Software:

gss
