beto

Spanish Pre-Trained BERT Model for Enhanced Language Processing

Product Description

BETO is a Spanish-language BERT model trained on a large Spanish corpus using the Whole Word Masking technique. Its architecture matches BERT-Base, and it is released in both uncased and cased versions for Spanish natural language processing tasks. BETO outperforms Multilingual BERT on several Spanish benchmarks, including POS tagging and NER-C. Its vocabulary of roughly 31k BPE subwords gives broad coverage of Spanish word forms. The model is distributed through the HuggingFace Transformers library and supports a wide range of Spanish NLP applications.
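As a minimal sketch of that Transformers usage (assuming the cased checkpoint published on the HuggingFace Hub under dccuchile/bert-base-spanish-wwm-cased), loading BETO for masked-token prediction might look like this:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Assumed Hub id for the cased BETO checkpoint; an uncased
# variant (dccuchile/bert-base-spanish-wwm-uncased) also exists.
MODEL_ID = "dccuchile/bert-base-spanish-wwm-cased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# Fill in a masked token in a Spanish sentence.
text = f"Madrid es la capital de {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

For downstream tasks such as POS tagging or NER, the same checkpoint can be loaded through the corresponding task heads (e.g. AutoModelForTokenClassification) and fine-tuned on labeled Spanish data.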
Project Details