Journal of South China University of Technology(Natural Science Edition) ›› 2019, Vol. 47 ›› Issue (8): 77-83,95.doi: 10.12141/j.issn.1000-565X.180398

• Computer Science & Technology •

Distributed Low-rank Tensor Subspace Clustering Algorithm

LIU Xiaolan1,2, PAN Gan1, YI Miao3, LI Zhipeng4

  1. School of Mathematics,South China University of Technology,Guangzhou 510640,Guangdong,China; 2. State Key Laboratory for Novel Software Technology,Nanjing University,Nanjing 210023,Jiangsu,China; 3. College of Physical Science and Technology,Yichun University,Yichun 336000,Jiangxi,China; 4. School of Computer Science and Engineering,South China University of Technology,Guangzhou 510006,Guangdong,China
  • Received:2018-08-11 Revised:2019-02-18 Online:2019-08-25 Published:2019-08-01
  • Contact: LIU Xiaolan (b. 1979), female, Ph.D., associate professor; her research interests include optimization algorithms and machine learning. E-mail: liuxl@scut.edu.cn
  • About author: LIU Xiaolan (b. 1979), female, Ph.D., associate professor; her research interests include optimization algorithms and machine learning.
  • Supported by:
    Supported by the National Natural Science Foundation of China(61502175,61273295) and the Natural Science Foundation of Guangdong Province(2016A030313545)

Abstract: Subspace clustering based on low-rank representation (LRR) cannot handle large-scale data effectively, and the distributed low-rank subspace clustering algorithm (DFC-LRR) cannot handle high-dimensional data directly. To address this issue, a distributed low-rank subspace clustering algorithm based on tensors and distributed computing was proposed. The proposed method first treats high-dimensional data as a tensor and extends the LRR subspace clustering algorithm to high-dimensional data by introducing tensor multiplication into the self-representation of the data. Then the low-rank coefficient tensor is obtained through distributed parallel computing, and a sparse similarity matrix is obtained by sparsifying each lateral slice of the coefficient tensor. Experimental results on the Extended Yale B, COIL20 and UCSD datasets show that the proposed algorithm outperforms DFC-LRR in clustering accuracy, and that distributed computing reduces the running time markedly.
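The step of collapsing the low-rank coefficient tensor into a sparse similarity matrix can be sketched as follows. This is a simplified illustration under assumed conventions, not the paper's exact procedure: the coefficient tensor `C` is taken to be of size n × n × k, the slices are aggregated by absolute value, and "sparsifying" is approximated by keeping only the largest entries in each row before symmetrizing.

```python
import numpy as np

def similarity_from_coeff_tensor(C, keep=5):
    """Collapse a coefficient tensor C (n x n x k) into a sparse,
    symmetric, non-negative similarity matrix (illustrative sketch)."""
    n = C.shape[0]
    # Aggregate the magnitudes of all slices per sample pair (i, j)
    S = np.abs(C).sum(axis=2)
    # Sparsify: keep only the `keep` largest entries in each row
    W = np.zeros_like(S)
    for i in range(n):
        idx = np.argsort(S[i])[-keep:]
        W[i, idx] = S[i, idx]
    # Symmetrize so the matrix can feed a spectral clustering step
    return (W + W.T) / 2

# Toy example with a random "coefficient tensor" of 8 samples, 3 slices
rng = np.random.default_rng(0)
C = rng.standard_normal((8, 8, 3))
W = similarity_from_coeff_tensor(C, keep=3)
print(W.shape)
```

In practice the resulting `W` would be passed to a spectral clustering routine (e.g. normalized cuts) to produce the final subspace labels; the distributed part of the algorithm concerns how `C` itself is computed in parallel, which is omitted here.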

Key words: low-rank representation, subspace clustering, distributed computing, tensor
