Flowformer
Flowformer addresses the quadratic complexity of standard transformer attention with a linear-complexity attention mechanism grounded in flow network theory. This makes it practical for long sequences (over 4,000 tokens) while remaining task-universal: it reports strong results across long sequence modeling, vision, NLP, time series, and reinforcement learning. Its core component, Flow-Attention, induces competition among tokens through flow conservation, so attention is allocated non-trivially without computing the full pairwise softmax. The design is backed by theoretical analysis and achieves competitive or superior performance against prior efficient transformers on these benchmarks.
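To make the flow-conservation idea concrete, here is a minimal NumPy sketch of flow-style linear attention in the spirit of Flow-Attention. This is a simplified illustration under stated assumptions, not the paper's exact formulation: the sigmoid feature map, the specific normalization of incoming/outgoing flows, and the competition/allocation gating below are assumptions chosen to show how conservation yields O(n) attention.

```python
import numpy as np

def flow_attention(Q, K, V, eps=1e-6):
    """Sketch of flow-style linear attention, O(n) in sequence length.

    Hypothetical simplification of Flowformer's Flow-Attention:
    queries act as sinks and keys as sources of information flow;
    conserving incoming/outgoing flow replaces the pairwise softmax.
    Q, K: (n, d) arrays; V: (n, d_v) array.
    """
    phi_q = 1.0 / (1.0 + np.exp(-Q))  # non-negative query features
    phi_k = 1.0 / (1.0 + np.exp(-K))  # non-negative key features

    # Total flow into each sink and out of each source (both length n),
    # computed without materializing the n x n attention matrix.
    incoming = phi_q @ phi_k.sum(axis=0) + eps
    outgoing = phi_k @ phi_q.sum(axis=0) + eps

    # Competition among sources: conserved outgoing flow reweights values,
    # so sources must compete for a fixed flow budget.
    conserved_out = phi_k @ (phi_q / incoming[:, None]).sum(axis=0)
    w = np.exp(conserved_out - conserved_out.max())
    V_hat = (w / w.sum())[:, None] * V * len(V)

    # Aggregation in O(n * d * d_v): (phi_q / incoming) @ (phi_k^T V_hat)
    context = phi_k.T @ V_hat
    out = (phi_q / incoming[:, None]) @ context

    # Allocation among sinks: conserved incoming flow gates each output.
    conserved_in = phi_q @ (phi_k / outgoing[:, None]).sum(axis=0)
    gate = 1.0 / (1.0 + np.exp(-conserved_in))
    return gate[:, None] * out
```

Because keys and values are contracted into a `(d, d_v)` context matrix before the queries touch them, cost grows linearly with sequence length, which is what enables the 4,000+ token regime described above.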