KR-BERT
KR-BERT, created at Seoul National University, is a Korean-specific pre-trained language model built around a BidirectionalWordPiece tokenizer. It provides both character-level and sub-character-level (jamo) vocabularies, a design suited to the agglutinative structure of Korean that improves performance on downstream tasks such as sentiment analysis. Its compact, Korean-targeted vocabulary and flexible tokenization make it an effective choice for Korean NLP applications.
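To make the sub-character idea concrete, the sketch below decomposes precomposed Hangul syllables into their constituent jamo using the standard Unicode arithmetic. This is a hypothetical standalone helper for illustration only, not KR-BERT's actual tokenizer code:

```python
# Illustrative sketch of sub-character (jamo) decomposition, the idea behind
# KR-BERT's sub-character vocabulary. Hypothetical helper, not KR-BERT code.

CHOSEONG = list("ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ")          # 19 initial consonants
JUNGSEONG = list("ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ")     # 21 vowels
JONGSEONG = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")  # empty + 27 finals

def decompose(text: str) -> list[str]:
    """Split each precomposed Hangul syllable into its jamo; pass other chars through."""
    out = []
    for ch in text:
        code = ord(ch) - 0xAC00
        if 0 <= code <= 11171:               # precomposed syllable block U+AC00..U+D7A3
            lead, rest = divmod(code, 588)   # 588 = 21 vowels * 28 finals
            vowel, tail = divmod(rest, 28)
            out.append(CHOSEONG[lead])
            out.append(JUNGSEONG[vowel])
            if JONGSEONG[tail]:              # skip the empty final consonant
                out.append(JONGSEONG[tail])
        else:
            out.append(ch)
    return out

print(decompose("한국어"))  # ['ㅎ', 'ㅏ', 'ㄴ', 'ㄱ', 'ㅜ', 'ㄱ', 'ㅇ', 'ㅓ']
```

Tokenizing at this level lets the model share statistics across syllables that contain the same jamo, which is one way a sub-character vocabulary can stay small while still covering Korean text.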