ttt-lm-pytorch
A PyTorch implementation of Test-Time Training (TTT) layers: sequence modeling layers with linear complexity and an expressive hidden state that keeps adapting at test time, improving RNN-style efficiency on long contexts. The repository provides TTT-Linear and TTT-MLP layers geared toward inference and designed to integrate with Huggingface Transformers.
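A minimal usage sketch of the Huggingface-style interface described above. The class names `TTTForCausalLM` and `TTTConfig`, the `ttt` module path, and the `ttt_layer_type` option are assumptions based on common Transformers conventions, not confirmed by this description; check the repository's source for the actual API.

```python
# Sketch only: TTTConfig, TTTForCausalLM, and ttt_layer_type are assumed names.
from transformers import AutoTokenizer

from ttt import TTTConfig, TTTForCausalLM  # assumed module layout

# Configure a small causal LM whose sequence-modeling layers are TTT-Linear
# ("linear"); TTT-MLP would presumably be selected with "mlp".
config = TTTConfig(ttt_layer_type="linear")
model = TTTForCausalLM(config)

# Any compatible tokenizer can be used; the checkpoint name here is illustrative.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
inputs = tokenizer("Test-Time Training layers adapt their hidden state", return_tensors="pt")

# Standard Transformers generation loop; the TTT hidden state is updated
# during the forward pass itself, which is what "test-time training" refers to.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```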