
DaG LLM ver 1.0: Pioneering Instruction-Tuned Language Modeling for Korean NLP

2023-11-23

Dongjun Jang, Sangah Lee, Sungjoo Byun, Jinwoong Kim, Jean Seo, Minseok Kim, Soyeon Kim, Chaeyoung Oh, Jaeyoon Kim, Hyemi Jo, Hyopil Shin


Abstract

This paper presents the DaG LLM (David and Goliath Large Language Model), a language model specialized for Korean and fine-tuned through Instruction Tuning across 41 tasks within 13 distinct categories.
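To make the instruction-tuning setup concrete, here is a minimal sketch of how a supervised example for such a model might be rendered into a single training string. The prompt template, field names, and the sample task below are illustrative assumptions; the paper does not specify DaG LLM's exact format.

```python
# Hypothetical instruction-tuning example formatter. The "### Instruction /
# ### Input / ### Response" template is an assumption for illustration,
# not the format documented for DaG LLM.

def format_example(instruction: str, inp: str, output: str) -> str:
    """Render one supervised (instruction, input, output) triple as text."""
    prompt = f"### Instruction:\n{instruction}\n"
    if inp:  # some tasks have no auxiliary input
        prompt += f"### Input:\n{inp}\n"
    prompt += f"### Response:\n{output}"
    return prompt

# One example from a hypothetical sentiment-classification task category.
sample = format_example(
    instruction="Classify the sentiment of the following Korean sentence.",
    inp="이 영화 정말 재미있었어요.",
    output="positive",
)
print(sample)
```

Fine-tuning then proceeds as ordinary next-token prediction over strings like `sample`, with one such template applied uniformly across all 41 tasks so the model learns to follow instructions rather than a single task format.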
