
TextPruner

Utilize Training-Free Pruning to Optimize Pre-Trained Language Models

Product Description

Learn efficient, training-free techniques to reduce the size and increase the inference speed of language models without retraining. TextPruner supports transformer models such as BERT and RoBERTa while maintaining performance on NLP tasks. It can be used as a Python package or from the command line, with worked examples provided, and is updated continuously alongside ongoing research on model compression for a range of language applications.
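TextPruner's own API differs; as a rough illustration of the training-free idea behind this kind of pruning, the sketch below scores the intermediate neurons of a transformer feed-forward layer by weight magnitude and keeps only the most important ones, with no gradient updates or retraining. The function name, scoring rule, and matrix shapes (BERT-base-like) are illustrative assumptions, not TextPruner's implementation.

```python
import numpy as np

def prune_ffn_neurons(W_in, W_out, keep_ratio=0.5):
    """Illustrative training-free pruning: rank intermediate FFN
    neurons by weight magnitude and keep the top fraction, shrinking
    both projection matrices accordingly (hypothetical helper, not
    TextPruner's API)."""
    # Importance of each intermediate neuron: L2 norm of its weights
    # in both the input and output projections. No gradients needed.
    importance = np.linalg.norm(W_in, axis=1) + np.linalg.norm(W_out, axis=0)
    n_keep = max(1, int(len(importance) * keep_ratio))
    # Indices of the most important neurons, kept in original order.
    keep = np.sort(np.argsort(importance)[-n_keep:])
    return W_in[keep, :], W_out[:, keep]

rng = np.random.default_rng(0)
W_in = rng.normal(size=(3072, 768))   # hidden -> intermediate (BERT-base shapes)
W_out = rng.normal(size=(768, 3072))  # intermediate -> hidden
W_in_p, W_out_p = prune_ffn_neurons(W_in, W_out, keep_ratio=0.5)
print(W_in_p.shape, W_out_p.shape)  # (1536, 768) (768, 1536)
```

Because the pruned matrices are simply smaller dense matrices, the layer runs faster and uses less memory with no architectural changes, which is the core appeal of training-free structured pruning.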