
Attention Mechanism



From: attention mechanism

Attention is a component of a network's architecture that is in charge of managing and quantifying the interdependence (a minimal code sketch follows the list below):

  1. Between the input and output elements (General Attention)
  2. Within the input elements (Self-Attention)
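The post stops short of code, but a minimal NumPy sketch of scaled dot-product attention (one common formulation, popularized by Transformers) can illustrate both cases: in self-attention, the queries, keys, and values are all derived from the same input sequence, while in general attention the queries come from the output side and the keys/values from the input side. All function and variable names here are illustrative, not from the original post.

```python
# A minimal sketch of scaled dot-product attention, assuming the
# Transformer-style formulation: softmax(QK^T / sqrt(d_k)) V.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).

    Returns the attended outputs and the attention weights.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # pairwise query-key similarity
    weights = softmax(scores, axis=-1)  # each query's distribution over keys
    return weights @ V, weights         # weighted sum of value vectors

# Self-attention: queries, keys, and values all come from the same sequence.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))             # 5 input tokens, dimension 8
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape, w.shape)               # (5, 8) (5, 5)
```

Each row of `w` sums to 1 and quantifies how much each token attends to every other token; for general attention, one would instead pass decoder states as `Q` and encoder states as `K` and `V`.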

While Attention does have applications in other fields of deep learning, such as Computer Vision, its main breakthrough and success come from its application to Natural Language Processing (NLP) tasks. This is because Attention was introduced to address the problem of long sequences in Machine Translation, a problem shared by most other NLP tasks.



Original source: https://www.cnblogs.com/dulun/p/12240867.html
