lmppl

Enhance Your Text Analysis Using Language Model Perplexity Tools

Product Description

LM-PPL is an efficient tool for computing text perplexity with pre-trained language models such as GPT, BERT, and T5. It helps assess text fluency by computing ordinary perplexity for recurrent (decoder-only) models, perplexity of the decoder for encoder-decoder models, and pseudo-perplexity for masked language models. In applications such as sentiment analysis, LM-PPL lets you select the candidate text with the lower perplexity as the more fluent, and therefore more likely, prediction. It is installable via pip and offers a straightforward way to apply popular models to a variety of text-evaluation needs.
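The snippet below is a minimal sketch of that workflow, assuming the package has been installed with pip install lmppl. The lmppl.LM class and get_perplexity method follow the library's published usage examples, but details may differ between versions, so check the documentation of the installed release.

```python
# Minimal sketch: score candidate texts with a decoder-only model and keep
# the one with the lower perplexity (names assumed from lmppl's examples).
import lmppl

# Recurrent / decoder-only model: ordinary perplexity
scorer = lmppl.LM('gpt2')

texts = [
    'sentiment classification: I dropped my laptop and someone stole my coffee. I am happy.',
    'sentiment classification: I dropped my laptop and someone stole my coffee. I am sad.',
]

# One perplexity value per input text
ppl = scorer.get_perplexity(texts)

# Lower perplexity indicates the more fluent (more likely) text
print(list(zip(texts, ppl)))
print('prediction:', texts[ppl.index(min(ppl))])
```

For masked models the analogous class would be a masked-LM scorer using pseudo-perplexity, and for encoder-decoder models an encoder-decoder scorer that takes input and output texts separately; both follow the same select-the-lower-perplexity pattern.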
Project Details