Attention Mechanism
Attention is one component of a network's architecture, and is in charge of managing and quantifying the interdependence between the input and output elements (general attention), or within the input elements themselves (self-attention).
While attention has applications in other areas of deep learning such as computer vision, its main breakthrough and success come from Natural Language Processing (NLP). Attention was introduced to address the problem of long sequences in machine translation, a problem shared by most other NLP tasks.
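To make the idea concrete, here is a minimal sketch of one common formulation, the scaled dot-product attention of Vaswani et al. The original post contains no code, so this NumPy implementation and all names in it (`Q`, `K`, `V`, `scaled_dot_product_attention`) are illustrative assumptions: each query is compared against every key, the similarity scores are normalized with a softmax, and the resulting weights form a weighted sum over the values.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v).
    Returns (n_q, d_v) context vectors and (n_q, n_k) attention weights."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise query-key similarity
    weights = softmax(scores, axis=-1)   # how much each key matters per query
    return weights @ V, weights

# Example: 3 queries attending over 5 key/value pairs.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 16))
context, weights = scaled_dot_product_attention(Q, K, V)
print(context.shape, weights.shape)  # (3, 16) (3, 5)
```

Because each output is a weighted sum over all positions, the model can relate distant elements of a sequence directly, which is why attention helps with the long-sequence problem described above.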
Original article: https://www.cnblogs.com/dulun/p/12240867.html