
A Modern Turkish Poet: Fine-Tuned GPT-2

2023-09-15 · 8th International Conference on Computer Science and Engineering (UBMK) 2023 · Code Available

Uygar Kurt, Aykut Çayır


Abstract

Generative tasks are becoming more realistic thanks to improvements in deep learning, and text generation is growing increasingly important as large language models (LLMs) advance. Although ChatGPT has given rise to many computer-generated literary works, it has notable limitations: it is not open-source and cannot be fine-tuned. Because other LLMs are understudied in Turkish, they usually cannot be used effectively to generate literary work. In this paper, we fine-tuned GPT-2 [1] models on sixty Turkish poem categories and trained five models that convey different emotions and write on different topics. The results are gathered in a 70-page Turkish poem book named "Gerçekliğin İçinde" ("Inside Reality"), which contains 50 poems in total, 10 for each chapter. The book was published on Amazon [2], Goodreads [3], and Google Books [4].
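One common way to fine-tune a single GPT-2 checkpoint per topic or emotion is to prefix each training poem with a category control tag, so the model learns to condition its output on that tag. The sketch below shows only the data-preparation step under that assumption; the control-tag format, the `build_example` helper, and the placeholder poem lines are illustrative, not taken from the paper.

```python
# Hypothetical sketch: assemble category-conditioned training examples for
# GPT-2 fine-tuning. The "<category>" control tags and sample lines are
# assumptions for illustration; the actual dataset and format used by the
# authors are not specified here.

def build_example(category: str, poem: str, eos: str = "<|endoftext|>") -> str:
    """Prefix a poem with its category tag and append GPT-2's EOS token,
    so one fine-tuned model can be steered toward a topic at sampling time."""
    return f"<{category}> {poem.strip()} {eos}"

# Tiny illustrative corpus: (category, poem fragment) pairs.
poems = [
    ("ask", "ornek dize bir"),      # "love" category, placeholder line
    ("hasret", "ornek dize iki"),   # "longing" category, placeholder line
]

corpus = [build_example(cat, text) for cat, text in poems]
```

Each string in `corpus` would then be tokenized and fed to a standard causal-language-modeling fine-tuning loop (e.g., with the Hugging Face `transformers` GPT-2 implementation), one such run per category group.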
