Knowledge distillation based semantic communications for multiple users

Liu, C., Zhou, Y., Chen, Y. and Yang, S.-H. (2023) Knowledge distillation based semantic communications for multiple users. IEEE Transactions on Wireless Communications. ISSN 1558-2248

Text - Accepted Version
It is advisable to refer to the publisher's version if you intend to cite from this work. See Guidance on citing.

To link to this item, use DOI: 10.1109/TWC.2023.3336941


Deep learning (DL) has shown great potential in revolutionizing traditional communication systems, and many applications in communications have adopted DL techniques due to their powerful representation ability. However, learning-based methods can be dependent on the training dataset and perform worse on unseen interference due to limited model generalizability and complexity. In this paper, we consider a semantic communication (SemCom) system with multiple users, where the number of training samples is limited and unexpected interference is present. To improve the model's generalization ability and reduce the model size, we propose a knowledge distillation (KD) based system in which a Transformer-based encoder-decoder serves as the semantic encoder-decoder and fully connected neural networks serve as the channel encoder-decoder. Specifically, four types of knowledge transfer and model compression are analyzed. Important system and model parameters are considered, including the level of noise and interference, the number of interfering users, and the size of the encoder and decoder. Numerical results demonstrate that KD significantly improves robustness and generalization ability under unexpected interference, and that it reduces the performance loss when compressing the model size.
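The paper's exact training objective is not reproduced in this record. As a rough illustration of the knowledge-distillation idea the abstract refers to, the sketch below implements a common response-based KD loss (a Hinton-style soft/hard target mixture); the temperature `T` and weight `alpha`, and the use of this particular loss form, are assumptions for illustration, not necessarily the authors' formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis (numerically stable)."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic response-based KD loss (illustrative, not the paper's exact objective).

    alpha weights the soft-target KL term against the hard-label
    cross-entropy term; T is the distillation temperature.
    """
    # Soft targets: KL(teacher || student) at temperature T, scaled by T^2
    # so gradients keep a comparable magnitude as T varies.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.mean(np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)),
                          axis=-1)) * T ** 2
    # Hard targets: standard cross-entropy against the ground-truth labels.
    p = softmax(student_logits)
    hard = np.mean(-np.log(p[np.arange(len(labels)), labels] + 1e-12))
    return alpha * soft + (1 - alpha) * hard
```

In practice a compact student (e.g. a smaller Transformer semantic encoder-decoder) would be trained with such a loss against a larger teacher's outputs, which matches the abstract's goal of shrinking the model while limiting the performance loss.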

Item Type: Article
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 114136

