Joint multiple dictionary learning for tensor sparse coding

Fu, Y., Gao, J., Sun, Y. and Hong, X. ORCID: https://orcid.org/0000-0002-6832-2298 (2014) Joint multiple dictionary learning for tensor sparse coding. In: 2014 International Joint Conference on Neural Networks (IJCNN), July 6-11, 2014, Beijing, China.
It is advisable to refer to the publisher's version if you intend to cite from this work.

Official URL: http://dx.doi.org/10.1109/IJCNN.2014.6889490

Abstract/Summary

Traditional dictionary learning algorithms find a sparse representation of high-dimensional data by first flattening each sample into a one-dimensional (1D) vector. This 1D model loses the inherent spatial structure of the data. An alternative is to apply tensor decomposition and learn dictionaries on the data in its original structural form, a tensor: one dictionary is learned along each mode, and the corresponding sparse representation is taken with respect to the Kronecker product of these dictionaries. All existing methods learn these mode dictionaries by updating each one iteratively in an alternating manner. Because atoms from the different mode dictionaries jointly contribute to the sparsity of the tensor, treating each mode dictionary independently ignores the correlations between atoms across modes. In this paper, we propose a joint multiple dictionary learning method for tensor sparse coding, which exploits atom correlations in the sparse representation and updates multiple atoms from each mode dictionary simultaneously. In this algorithm, the Frequent-Pattern Tree (FP-tree) mining algorithm is employed to find frequent atom patterns in the sparse representation. Inspired by the idea of K-SVD, we develop a new dictionary update method that jointly updates the elements in each pattern. Experimental results demonstrate that our method outperforms other tensor-based dictionary learning algorithms.
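For illustration only (this sketch is not from the paper): a minimal NumPy example of the mode-wise dictionary model the abstract describes, where a 2-D sample X is represented as X ≈ D1 · B · D2ᵀ, or equivalently vec(X) ≈ (D2 ⊗ D1) vec(B) with a sparse coefficient core B. All names and sizes (D1, D2, B, m1, k1, ...) are hypothetical choices made for the demonstration.

```python
# Illustrative sketch (not the paper's implementation): tensor sparse
# coding with one dictionary per mode. A 2-D sample X is modelled as
# X ~= D1 @ B @ D2.T, equivalent to the Kronecker 1-D form
# vec(X) ~= kron(D2, D1) @ vec(B), where B is a sparse coefficient core.
import numpy as np

rng = np.random.default_rng(0)

m1, m2 = 8, 8      # sample dimensions along mode 1 and mode 2 (assumed)
k1, k2 = 12, 12    # atoms per mode dictionary, overcomplete (assumed)

D1 = rng.standard_normal((m1, k1))   # mode-1 dictionary
D2 = rng.standard_normal((m2, k2))   # mode-2 dictionary
D1 /= np.linalg.norm(D1, axis=0)     # unit-norm atoms, as in K-SVD
D2 /= np.linalg.norm(D2, axis=0)

# Sparse core: only a few (i, j) atom pairs are active. Each active
# entry B[i, j] couples atom i of D1 with atom j of D2, which is why
# atoms from different mode dictionaries are correlated in their
# contribution to the sparsity of the tensor.
B = np.zeros((k1, k2))
for i, j in [(1, 4), (1, 7), (9, 4)]:
    B[i, j] = rng.standard_normal()

X = D1 @ B @ D2.T                    # synthesize a sample

# Verify the equivalence with the Kronecker-product 1-D model, using
# the column-stacking identity vec(D1 B D2^T) = (D2 kron D1) vec(B).
vec = lambda M: M.reshape(-1, order="F")
assert np.allclose(vec(X), np.kron(D2, D1) @ vec(B))
```

In this toy core the active entries share atom 1 of D1 and atom 4 of D2; frequently co-occurring atom-index patterns of this kind are what the FP-tree mining step in the paper detects, so that the atoms participating in a pattern can be updated jointly rather than one mode dictionary at a time.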