PE-GPT: A Physics-Informed Interactive Large Language Model for Power Converter Modulation Design
Fanfan Lin, Junhua Liu, Xinze Li, Shuai Zhao, Bohui Zhao, Hao Ma, Xin Zhang
Abstract
This paper proposes PE-GPT, a large language model custom-tailored for power converter modulation design. By harnessing in-context learning and specialized tiered physics-informed neural networks, PE-GPT guides users through text-based dialogues and recommends actionable modulation parameters. The effectiveness of PE-GPT is validated through a practical design case involving dual active bridge converters, supported by hardware experimentation. This research underscores the transformative potential of large language models in power converter modulation design, offering enhanced accessibility, explainability, and efficiency, thereby setting a new paradigm in the field.