LSTM Attention vs Self-Attention: How Bahdanau Evolved
Bahdanau's 2014 attention fixed the seq2seq bottleneck but kept sequential encoders. Here's how three key problems led to self-attention and 3x speedups.
Read the full article: LSTM Attention vs Self-Attention: How Bahdanau Evolved
#Attention Mechanism, #Transformer, #LSTM, #Seq2seq, #NMT