zbMATH — the first resource for mathematics

Learning a metric when clustering data points in the presence of constraints. (English) Zbl 07205272
Summary: Learning an appropriate distance measure under the supervision of side information has become a topic of significant interest within the machine learning community. In this paper, we address the problem of metric learning for constrained clustering by considering three important issues: (1) accounting for the importance degree of constraints, (2) preserving the topological structure of the data, and (3) preserving natural distribution properties of the data. This work provides a unified way to handle these issues in constrained clustering by learning an appropriate distance measure. The first issue is modeled by injecting the importance degrees of the constraints directly into the objective function. The topological structure of the data is preserved by minimizing the reconstruction error of the data in the target space. Finally, the natural distribution properties of the data are preserved by using its proximity information. We propose two different methods to address these issues: the first learns a linear transformation of the data into a target space (linear model), and the second uses kernel functions to learn an appropriate distance measure (non-linear model). Experiments show that considering these issues significantly improves clustering accuracy.
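The idea of injecting constraint importance degrees directly into a metric-learning objective, for the linear-transform variant, can be sketched as a toy gradient-descent procedure. Everything below (the function name, the hinge-style cannot-link term, the learning rate, and the margin) is an illustrative assumption, not the paper's exact formulation:

```python
import numpy as np

def learn_linear_metric(X, must_link, cannot_link, weights_ml, weights_cl,
                        dim=2, lr=0.01, epochs=200, margin=1.0):
    """Toy sketch: learn a linear map L (dim x d) so that weighted
    must-link pairs are close and weighted cannot-link pairs are at
    least `margin` apart in the target space.  The per-constraint
    weights play the role of importance degrees."""
    rng = np.random.default_rng(0)
    d = X.shape[1]
    L = rng.normal(scale=0.1, size=(dim, d))
    for _ in range(epochs):
        grad = np.zeros_like(L)
        # pull must-link pairs together, scaled by their importance degree
        for (i, j), w in zip(must_link, weights_ml):
            diff = X[i] - X[j]
            grad += 2 * w * np.outer(L @ diff, diff)
        # push cannot-link pairs apart while their squared distance
        # in the target space is still below the margin (hinge term)
        for (i, j), w in zip(cannot_link, weights_cl):
            diff = X[i] - X[j]
            if np.sum((L @ diff) ** 2) < margin:
                grad -= 2 * w * np.outer(L @ diff, diff)
        L -= lr * grad
    return L
```

Raising a constraint's weight makes its term dominate the gradient, which is one simple way to realize the "importance degree" of issue (1); the paper's actual objective additionally carries the reconstruction-error and proximity terms of issues (2) and (3), which are omitted here for brevity.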
MSC:
30C40 Kernel functions in one complex variable and applications
54E35 Metric spaces, metrizability
91C20 Clustering in the social and behavioral sciences
Software:
MNIST