ttt-lm-pytorch

Improving RNN Efficiency Using Expressive Test-Time Training Layers

Product Description

Sequence modeling layers with linear complexity and expressive hidden states, designed to make RNNs efficient over long contexts. This PyTorch implementation provides Test-Time Training (TTT) layers, whose hidden states are updated by a learning step on each test sequence. It includes TTT-Linear and TTT-MLP layers geared toward inference, and is built to work with Hugging Face Transformers.
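To illustrate the core idea behind a TTT layer, the following is a minimal conceptual sketch (not the project's actual implementation): the hidden state is itself a small linear model `W`, and each incoming token triggers one gradient step on a self-supervised reconstruction loss. The function name, initialization, and loss choice here are illustrative assumptions for exposition only.

```python
import numpy as np

def ttt_linear_sketch(tokens, lr=0.1):
    """Conceptual TTT-Linear-style sketch (illustrative, not the real layer).

    tokens: (seq_len, d) array of token embeddings.
    The "hidden state" is a linear model W, updated at test time by
    gradient descent on the reconstruction loss ||W x - x||^2.
    """
    d = tokens.shape[1]
    W = np.zeros((d, d))          # inner-model weights = the hidden state
    outputs = []
    for x in tokens:
        z = W @ x                  # output using the current state
        residual = (W @ x) - x     # self-supervised reconstruction error
        grad = 2.0 * np.outer(residual, x)  # d/dW of ||W x - x||^2
        W = W - lr * grad          # one test-time gradient step on the state
        outputs.append(z)
    return np.stack(outputs), W
```

On a repeated token, each step shrinks the reconstruction error, showing how the state "trains" as the sequence is consumed; the real TTT-Linear/TTT-MLP layers use learned inner losses and richer inner models.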