Multiple kernel clustering with neighbor-kernel subspace segmentation
Author: | Zhou, Sihang1; Liu, Xinwang1; Li, Miaomiao2; |
Organizations: |
1 School of Computer Science, National University of Defense Technology, Changsha 410073, China
2 Changsha College, Changsha 410073, China
3 College of System Engineering, National University of Defense Technology, Changsha 410073, China
4 Department of Machine Vision and Signal Analysis, University of Oulu, 90014 Oulu, Finland
5 Technology and Engineering Group, Tencent Technology (Shenzhen) Co., Ltd., Shenzhen 518064, China
6 School of Cyberspace Science, Dongguan University of Technology, Guangdong 523808, China |
Format: | article |
Version: | accepted version |
Access: | open |
Online Access: | PDF Full Text (PDF, 4.8 MB) |
Persistent link: | http://urn.fi/urn:nbn:fi-fe202001131854 |
Language: | English |
Published: | Institute of Electrical and Electronics Engineers, 2020 |
Publish Date: | 2020-01-13 |
Description: |
Abstract: Multiple kernel clustering (MKC) has been intensively studied during the last few decades. Even though existing MKC algorithms demonstrate promising clustering performance in various applications, they do not sufficiently consider the intrinsic neighborhood structure among base kernels, which can adversely affect the clustering performance. In this paper, we propose a simple yet effective neighbor-kernel-based MKC algorithm to address this issue. Specifically, we first define a neighbor kernel, which preserves the block diagonal structure and strengthens the robustness against noise and outliers among base kernels. After that, we linearly combine these base neighbor kernels to extract a consensus affinity matrix through exact-rank-constrained subspace segmentation. The block diagonal structure naturally possessed by neighbor kernels better serves the subsequent subspace segmentation, and in turn, the extracted shared structure is further refined through subspace segmentation based on the combined neighbor kernels. In this manner, the two learning processes are seamlessly coupled and negotiate with each other to achieve better clustering. Furthermore, we carefully design an efficient iterative optimization algorithm with proven convergence to solve the resultant optimization problem. As a by-product, careful theoretical analysis reveals an interesting insight into the exact-rank constraint in ridge regression: it back-projects the solution of the unconstrained counterpart onto its principal components. Comprehensive experiments on several benchmark data sets demonstrate the effectiveness of the proposed algorithm. |
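To make the neighbor-kernel construction concrete, the following is a minimal NumPy sketch under one plausible reading of the abstract: each base kernel is sparsified so that every sample keeps only the similarities to its tau most similar neighbors, which encourages the block diagonal structure and suppresses weak, noise-prone entries. The function name build_neighbor_kernel and the parameter tau are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def build_neighbor_kernel(K, tau):
    """One plausible neighbor-kernel construction (illustrative sketch):
    keep, for each sample, only the kernel values of its tau most
    similar neighbors, then symmetrize the result."""
    n = K.shape[0]
    N = np.zeros_like(K)
    for i in range(n):
        # rank row i by similarity, excluding the sample itself
        order = np.argsort(K[i])[::-1]
        neighbors = [j for j in order if j != i][:tau]
        N[i, neighbors] = K[i, neighbors]
        N[i, i] = K[i, i]  # retain the self-similarity
    # symmetrize so the sparsified matrix remains a valid affinity matrix
    return 0.5 * (N + N.T)

# Hypothetical usage: linearly combine the neighbor kernels of several
# base kernels with weights w before the subspace segmentation step.
# K_combined = sum(w_p * build_neighbor_kernel(K_p, tau)
#                  for w_p, K_p in zip(w, base_kernels))
```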
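The by-product mentioned at the end of the abstract can also be illustrated numerically. Assuming the standard self-expressive ridge-regression objective min_Z ||K - KZ||_F^2 + lambda * ||Z||_F^2, the unconstrained minimizer has the closed form Z* = (K^T K + lambda I)^{-1} K^T K, and the stated insight is that an exact rank-k constraint back-projects Z* onto its top-k principal components. The sketch below encodes that reading; the objective and the function names are assumptions for illustration, not code from the paper.

```python
import numpy as np

def ridge_self_expression(K, lam):
    """Closed-form minimizer of ||K - K @ Z||_F^2 + lam * ||Z||_F^2
    (assumed objective, illustrative only)."""
    n = K.shape[0]
    return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ K)

def rank_k_back_projection(K, lam, k):
    """Back-project the unconstrained solution onto its top-k principal
    components, mirroring the insight reported in the abstract."""
    Z = ridge_self_expression(K, lam)   # symmetric when K is symmetric
    w, U = np.linalg.eigh(Z)            # eigenvalues in ascending order
    Uk = U[:, np.argsort(w)[::-1][:k]]  # top-k principal directions
    return Uk @ Uk.T @ Z                # rank-k back-projected solution
```

For a symmetric positive semidefinite K = U diag(sigma) U^T, both Z* and its back-projection share the eigenvectors U, so the rank-k step simply truncates the spectrum sigma^2 / (sigma^2 + lambda) to its k largest values.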
Series: | IEEE Transactions on Neural Networks and Learning Systems |
ISSN: | 2162-237X |
ISSN-E: | 2162-2388 |
ISSN-L: | 2162-237X |
Volume: | 31 |
Issue: | 4 |
Pages: | 1351 - 1362 |
DOI: | 10.1109/TNNLS.2019.2919900 |
OADOI: | https://oadoi.org/10.1109/TNNLS.2019.2919900 |
Type of Publication: | A1 Journal article – refereed |
Field of Science: | 113 Computer and information sciences |
Copyright information: | © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |