Sequential non-stationary dynamic classification with sparse feedback. (English) Zbl 1187.68477
Summary: Many data analysis problems require robust tools for discerning between states or classes in the data. In this paper, we consider situations in which the decision boundaries between classes are potentially non-linear and subject to “concept drift”, so that static classifiers fail. The applications for which we present results are characterized by the requirement that decisions be made robustly online and by the fact that target labels may be missing, so there is often no feedback on the system’s performance. The inherent non-stationarity in the data is tracked using a non-linear dynamic classifier whose parameters evolve under an extended Kalman filter framework, derived within a sequential Bayesian learning paradigm. The method is extended to handle missing and incorrectly labeled targets and to actively request target labels, and it is shown to work well both in simulation and when applied to sequential decision problems in medical signal analysis.
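The core mechanism described in the summary (classifier parameters that evolve under an extended Kalman filter, with a measurement update only when a target label happens to arrive) can be illustrated with a small sketch. The Python code below is not the authors' implementation: it uses a plain logistic observation model rather than the paper's non-linear classifier, and the class name, state-noise level, and jitter constant are illustrative assumptions. When the label is missing, only the drift (time-update) step runs, which is how sparse feedback enters the recursion.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    class EKFDynamicClassifier:
        """Sketch: drifting-weight logistic classifier tracked by an EKF."""

        def __init__(self, dim, state_noise=1e-3):
            self.w = np.zeros(dim)               # current weight estimate
            self.P = np.eye(dim)                 # weight covariance
            self.Q = state_noise * np.eye(dim)   # random-walk (concept-drift) covariance

        def step(self, x, y=None):
            """One sequential step; y is 0/1, or None when the label is missing."""
            # Time update: weights drift, so uncertainty grows (tracks non-stationarity).
            self.P = self.P + self.Q
            p = sigmoid(self.w @ x)              # predicted class-1 probability
            if y is None:
                return p                         # sparse feedback: no measurement update
            # Measurement update: EKF linearisation of the logistic output around w.
            H = p * (1.0 - p) * x                # Jacobian of sigmoid(w @ x) w.r.t. w
            R = p * (1.0 - p) + 1e-8             # Bernoulli output variance as observation noise
            S = H @ self.P @ H + R               # innovation variance (scalar)
            K = self.P @ H / S                   # Kalman gain
            self.w = self.w + K * (y - p)
            self.P = self.P - np.outer(K, H) @ self.P
            return p

For example, clf = EKFDynamicClassifier(dim=3) followed by clf.step(x, y=1) performs a full update, whereas clf.step(x) with the label withheld only inflates the weight covariance, so the classifier keeps widening its uncertainty even when no feedback arrives.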

MSC:
68T10 Pattern recognition, speech recognition
68T05 Learning and adaptive systems in artificial intelligence
Software:
astsa