contextualized-topic-models
Contextualized Topic Models (CTMs) improve topic coherence, including in multilingual settings, by combining contextualized embeddings from pre-trained language models such as BERT with the traditional bag-of-words representation. The package provides two main variants: CombinedTM, which concatenates contextualized embeddings with bag-of-words features to produce more coherent topics, and ZeroShotTM, which relies only on contextualized embeddings and can therefore predict topics for unseen documents, including documents in languages not seen during training. The framework is adaptable to any pre-trained embedding model. Tutorials and documentation cover both language-specific and cross-lingual tasks. The package also includes Kitty, a human-in-the-loop classification utility for quickly identifying and labeling clusters of documents. The project is open source, benefits from community support, and is available under the MIT license.
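To make the difference between the two variants concrete, here is a minimal, illustrative sketch (not the library's actual API) of how their inputs differ: CombinedTM concatenates a document's contextualized embedding with its bag-of-words vector, while ZeroShotTM uses the contextualized embedding alone, which is why a multilingual encoder lets it handle languages unseen at training time. All function names below are hypothetical.

```python
def bag_of_words(tokens, vocab):
    """Count vector over a fixed vocabulary; out-of-vocabulary tokens are dropped."""
    index = {word: i for i, word in enumerate(vocab)}
    counts = [0] * len(vocab)
    for tok in tokens:
        if tok in index:
            counts[index[tok]] += 1
    return counts

def combined_input(contextual_embedding, tokens, vocab):
    """CombinedTM-style input: contextual embedding concatenated with BoW counts."""
    return list(contextual_embedding) + bag_of_words(tokens, vocab)

def zeroshot_input(contextual_embedding):
    """ZeroShotTM-style input: the contextual embedding alone, so no
    bag-of-words vocabulary is needed at inference time."""
    return list(contextual_embedding)

vocab = ["topic", "model", "language"]
emb = [0.1, -0.2, 0.3]  # stand-in for a sentence-embedding vector
doc = ["topic", "model", "topic"]

print(combined_input(emb, doc, vocab))  # [0.1, -0.2, 0.3, 2, 1, 0]
print(zeroshot_input(emb))              # [0.1, -0.2, 0.3]
```

Because the zero-shot variant never touches the bag-of-words vocabulary at inference, a model trained on English documents can score documents in another language as long as the embedding model maps both languages into the same space.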