curated-transformers
Curated Transformers is a PyTorch library of transformer models that are composed from a set of reusable building blocks. It supports large language models such as Falcon, Llama, and Dolly v2, works with standard PyTorch modules, and keeps external dependencies to a minimum. Consistent type annotations make it well suited for education and for use in type-checked codebases. Developed by Explosion and used as the default transformer implementation in spaCy 3.7, it can load checkpoints for architectures such as BERT and GPT variants directly from the Hugging Face Hub.
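As a brief illustration of loading a model from the Hugging Face Hub, the sketch below follows the pattern shown in the project's examples: an `AutoGenerator` downloads a Falcon checkpoint and runs greedy decoding over a batch of prompts. The class and parameter names used here (`AutoGenerator`, `GreedyGeneratorConfig`, the `name` and `device` arguments) are assumptions based on the documented examples and should be checked against the installed version.

```python
import torch
from curated_transformers.generation import AutoGenerator, GreedyGeneratorConfig

# Download the checkpoint from the Hugging Face Hub and build a generator;
# the concrete architecture is inferred from the checkpoint's configuration.
generator = AutoGenerator.from_hf_hub(
    name="tiiuae/falcon-7b-instruct",
    device=torch.device("cuda", index=0),
)

# Greedy decoding over a batch of prompts.
print(generator(["What is spaCy?"], GreedyGeneratorConfig()))
```

The same `from_hf_hub` pattern applies to the lower-level model and tokenizer classes, so encoder models like BERT can be pulled from the Hub and used as ordinary PyTorch modules.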