
Semantic Importance-Aware Communications Using Pre-trained Language Models

2023-02-12

Shuaishuai Guo, Yanhu Wang, Shujing Li, Nasir Saeed


Abstract

This letter proposes a semantic importance-aware communication (SIAC) scheme using pre-trained language models (e.g., ChatGPT and BERT). Specifically, we propose a cross-layer design in which a pre-trained language model is embedded in, or connected to, the cross-layer manager. The pre-trained language model is used to quantify the semantic importance of data frames, and based on the quantified importance we investigate semantic importance-aware power allocation. Unlike existing deep joint source-channel coding (Deep-JSCC)-based semantic communication schemes, SIAC can be embedded directly into current communication systems by introducing only a cross-layer manager. Our experimental results show that the proposed SIAC scheme achieves lower semantic loss than existing equal-priority communication schemes.
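The abstract describes the pipeline only at a high level: score each data frame's semantic importance with a pre-trained language model, then allocate transmit power according to those scores. The sketch below illustrates one plausible reading of those two steps; the choice of `bert-base-uncased`, the leave-one-out cosine-distance importance metric, and the proportional power split are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch: frame-level semantic importance scoring with a
# pre-trained LM (Hugging Face Transformers) plus proportional power
# allocation. The metric and allocation rule are assumptions for
# illustration, not the SIAC scheme as specified in the letter.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pooled last-hidden-state embedding of a text string."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)

def semantic_importance(frames: list[str]) -> torch.Tensor:
    """Score each frame by how far the message embedding shifts when
    that frame is left out (leave-one-out cosine distance)."""
    full = embed(" ".join(frames))
    scores = []
    for i in range(len(frames)):
        reduced = " ".join(frames[:i] + frames[i + 1:])
        partial = embed(reduced) if reduced else torch.zeros_like(full)
        sim = torch.nn.functional.cosine_similarity(full, partial, dim=0)
        scores.append(1.0 - sim.item())
    return torch.tensor(scores)

def allocate_power(importance: torch.Tensor, total_power: float) -> torch.Tensor:
    """Split a total power budget across frames in proportion to importance."""
    weights = importance.clamp(min=1e-6)
    return total_power * weights / weights.sum()

frames = [
    "The patient shows no allergy to penicillin.",
    "Administer 500 mg every eight hours.",
    "Please confirm receipt of this message.",
]
importance = semantic_importance(frames)
power = allocate_power(importance, total_power=1.0)
for frame, score, p in zip(frames, importance, power):
    print(f"importance={score:.3f}  power={p:.3f}  {frame}")
```

The leave-one-out embedding shift is just one simple proxy for how much meaning a frame carries; the letter's own importance measure and power-allocation objective should be taken from the paper itself.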
