kogpt
KoGPT by KakaoBrain is a Korean generative pre-trained transformer designed for tasks such as classification, search, summarization, and generation of Korean text. The model has over 6 billion parameters across 28 transformer layers and requires at least 32GB of GPU RAM to run; a float16 variant is also available and substantially reduces memory use. Because the model was trained on raw, unfiltered data, it can produce sensitive or socially unacceptable text, so outputs should be reviewed before use. Consult the model card for full specifications before integrating it into AI applications.
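Below is a minimal sketch of loading the float16 variant with the Hugging Face Transformers library and generating Korean text. The repository id `kakaobrain/kogpt`, the revision name `KoGPT6B-ryan1.5b-float16`, and the special-token names are assumptions based on the public model card; adjust them to match the actual release you are using.

```python
# Sketch: load KoGPT in half precision and generate Korean text.
# Repo id 'kakaobrain/kogpt' and revision 'KoGPT6B-ryan1.5b-float16'
# are assumed names; verify against the official model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO = "kakaobrain/kogpt"                # assumed Hugging Face repo id
REVISION = "KoGPT6B-ryan1.5b-float16"    # assumed float16 revision

tokenizer = AutoTokenizer.from_pretrained(
    REPO,
    revision=REVISION,
    bos_token="[BOS]", eos_token="[EOS]",   # assumed special tokens
    unk_token="[UNK]", pad_token="[PAD]", mask_token="[MASK]",
)

model = AutoModelForCausalLM.from_pretrained(
    REPO,
    revision=REVISION,
    torch_dtype=torch.float16,   # half precision to cut GPU memory roughly in half
    low_cpu_mem_usage=True,
).to("cuda")
model.eval()

# Korean prompt: "The future of artificial intelligence is"
prompt = "인공지능의 미래는"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        top_p=0.95,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Loading in float16 is what brings the memory requirement down from the 32GB needed for full precision; on smaller GPUs, generation parameters such as `max_new_tokens` may also need to be kept modest.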