https://huggingface.co/papers/2601.07832
This new attention method has the same efficiency as linear attention and reportedly improves accuracy by 41%, so it could be worth trying on LTX-2 :)