
splade

Leveraging BERT for Enhanced Sparse Lexical Retrieval Models

Product Description

SPLADE uses BERT to build sparse lexical models for first-stage ranking in information retrieval. Sparse representations bring efficiency gains and interpretable lexical matching. Recent improvements include static pruning for neural retrievers and refined training techniques. The models transfer well across domains, and pre-trained versions are available on Hugging Face, delivering performance comparable to traditional retrieval methods at reduced latency.
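As a rough illustration of how a SPLADE-style sparse representation is formed, the sketch below applies the log-saturated ReLU activation and max-pooling that SPLADE uses over per-token vocabulary logits. This is a minimal NumPy toy: in the real models, the logits come from a BERT masked-language-model head over the full vocabulary, and the function name and toy numbers here are illustrative, not part of the library.

```python
import numpy as np

def splade_pool(token_logits, attention_mask):
    """Collapse per-token vocabulary logits into one sparse vector.

    SPLADE-style activation: w_j = max_i log(1 + relu(logit_ij)),
    max-pooled over the non-padded token positions i.
    """
    weights = np.log1p(np.maximum(token_logits, 0.0))  # log(1 + ReLU)
    weights = weights * attention_mask[:, None]        # zero out padding
    return weights.max(axis=0)                         # max-pool over tokens

# Toy example: 3 token positions (last one is padding), vocabulary of 5 terms.
logits = np.array([
    [ 2.0, -1.0, -0.5,  0.5, -3.0],
    [-1.0, -2.0,  4.0, -0.5, -1.0],
    [ 9.0,  9.0,  9.0,  9.0,  9.0],  # padding position, masked out below
])
mask = np.array([1.0, 1.0, 0.0])
vec = splade_pool(logits, mask)
# Negative logits collapse to exact zeros, which is what makes the
# representation sparse and indexable like a classic lexical model.
```

Terms with only negative logits get weight exactly zero, so the resulting vector can be stored in a standard inverted index, which is the source of the efficiency and lexical-matching clarity mentioned above.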