GemBode and PhiBode: Adapting Small Language Models to Brazilian Portuguese
Type
Conference paper
Abstract
Recent advances in the generative capabilities of large language models have reshaped technology research and the way people work and think, bringing new capabilities to artificial intelligence solutions. However, the size of such models raises concerns about deploying them on hardware-limited resources. This paper presents a comprehensive study on training Portuguese-focused Small Language Models (SLMs). We developed a dedicated dataset for training our models and applied both full fine-tuning and parameter-efficient fine-tuning (PEFT) for comparative analysis. Using Microsoft's Phi and Google's Gemma as base models, we created our own models, named PhiBode and GemBode, ranging from approximately 1 billion to 7 billion parameters, with ten models in total. Our findings provide insights into the performance and applicability of these models and establish a clear benchmark for future research on Portuguese language processing. The results demonstrate the effectiveness of our training methods and the potential of our models for a variety of applications, contributing to language model training, particularly for the Portuguese language.
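To make the PEFT side of the methodology concrete, the sketch below shows how a small base model such as Gemma can be adapted with LoRA adapters using the Hugging Face transformers, peft, and datasets libraries. This is a minimal illustration only, not the authors' actual code: the base model checkpoint, the placeholder file portuguese_instructions.jsonl, and all hyperparameters are assumptions made for the example.

import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Hypothetical base model; a Phi checkpoint could be substituted the same way.
BASE_MODEL = "google/gemma-2b"

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, torch_dtype=torch.bfloat16)

# Wrap the base model with low-rank adapters so only a small fraction of the
# weights is trained, instead of updating every parameter (full fine-tuning).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Placeholder Portuguese instruction corpus in JSON Lines format with a "text" field;
# the dataset built by the authors is not reproduced here.
dataset = load_dataset("json", data_files="portuguese_instructions.jsonl", split="train")

def tokenize(example):
    return tokenizer(example["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gembode-lora",
        per_device_train_batch_size=4,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
        bf16=True,
        logging_steps=50,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("gembode-lora")

Full fine-tuning differs only in skipping the adapter wrapping step and training the base model directly, at a much higher memory cost; comparing the two regimes is the basis of the comparative analysis described in the abstract.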
Keywords
Bode, Generative Artificial Intelligence, Natural Language Processing, Portuguese, Small Language Models
Language
English
Citation
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), v. 15368 LNCS, p. 228-243.