
The Large Language Model GreekLegalRoBERTa

2024-10-10

Vasileios Saketos, Despina-Athanasia Pantazi, Manolis Koubarakis

Abstract

We develop four versions of GreekLegalRoBERTa, a large language model trained on Greek legal and non-legal text. We show that our models surpass the performance of GreekLegalBERT, GreekLegalBERT-v2, and GreekBERT on two tasks involving Greek legal documents: named entity recognition and multi-class legal topic classification. We view our work as a contribution to the study of domain-specific NLP tasks in low-resource languages, such as Greek, using modern NLP techniques and methodologies.
