
spacy-transformers

Utilize Advanced Transformers in spaCy for Improved NLP Processing

Product Description

This package integrates Hugging Face transformers such as BERT, GPT-2, and XLNet into spaCy, so they can be used directly inside standard spaCy NLP pipelines. Designed for spaCy v3, it supports multi-task learning (sharing one transformer across several pipeline components), automatic alignment between wordpieces and spaCy tokens, and customization of how transformer outputs are pooled and stored. It installs via pip and runs on both CPU and GPU. The package does not ship task-specific model heads, but transformer predictions for tasks such as text classification can still be accessed through wrapper components.
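As a rough illustration (not taken from the project page), loading a transformer-backed pipeline and inspecting the stored transformer output might look like the sketch below. The model name en_core_web_trf and the doc._.trf_data extension attribute follow spacy-transformers' documented defaults, but the exact attribute layout can differ between versions.

```python
# Minimal sketch, assuming spacy and spacy-transformers are installed
# and the transformer pipeline has been downloaded, e.g.:
#   pip install spacy spacy-transformers
#   python -m spacy download en_core_web_trf
import spacy

nlp = spacy.load("en_core_web_trf")          # transformer-backed English pipeline
doc = nlp("spaCy and transformers work together in one pipeline.")

# The transformer component stores its raw output on the Doc object;
# downstream components (tagger, parser, NER, textcat) listen to it.
trf_data = doc._.trf_data
print(type(trf_data))   # transformer output container (version-dependent layout)

# Linguistic annotations come from the usual spaCy API.
for token in doc:
    print(token.text, token.pos_, token.dep_)
```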
Project Details