Chinese-Mixtral-8x7B
This project extends the Mixtral-8x7B model with an expanded Chinese vocabulary and performs incremental pre-training on Chinese data to strengthen its Chinese understanding and generation. Both the vocabulary-expanded model and the incremental pre-training code are released as open source. The enlarged vocabulary markedly improves tokenization efficiency for Chinese, so the model encodes and decodes Chinese text with fewer tokens. As with any large language model, outputs may contain biases or factual inaccuracies, and users should review them critically. The model remains compatible with the acceleration and deployment tooling of the Mixtral-8x7B ecosystem, and the released weights can be downloaded either as a full model or as LoRA weights to be merged into the base model.
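As a minimal sketch of how the released weights might be loaded, the snippet below uses the standard Hugging Face `transformers` and `peft` APIs; the Hub IDs shown are placeholders rather than confirmed release names, so substitute the identifiers actually published by this project.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Placeholder Hub IDs -- replace with the identifiers this project publishes.
BASE_MODEL_ID = "HIT-SCIR/Chinese-Mixtral-8x7B"  # vocabulary-expanded base model (assumed name)
LORA_ID = "path/to/lora-weights"                 # optional LoRA adapter (assumed name)

# Load the vocabulary-expanded model and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL_ID,
    torch_dtype=torch.bfloat16,  # Mixtral is large; use a reduced-precision dtype
    device_map="auto",           # shard across available GPUs
)

# If the release ships LoRA weights rather than a merged model,
# attach the adapter and fold it into the base weights.
model = PeftModel.from_pretrained(model, LORA_ID)
model = model.merge_and_unload()

# Quick generation check on a Chinese prompt.
inputs = tokenizer("请用一句话介绍一下哈尔滨。", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the expanded tokenizer represents Chinese text with fewer tokens, the same prompt consumes less context and decodes in fewer steps than with the original Mixtral-8x7B tokenizer.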