
Knowledge-based topic models (KBTM)

Published: 2015-03-03 18:40:31

Tags: topic model, knowledge-based, DF-LDA, LDA

http://blog.csdn.net/pipisorry/article/details/44040701

Terminology

A must-link states that two words should belong to the same topic.
A cannot-link states that two words should not belong to the same topic.
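The two constraint types above can be made concrete with a small sketch. This is purely illustrative (the word pairs and the `satisfies` helper are hypothetical, not from any of the cited models): knowledge is stored as sets of word pairs, and a topic assignment is checked against both constraint types.

```python
# Hypothetical knowledge: word pairs the user believes should (or should not)
# share a topic.
must_links = {("price", "cost")}
cannot_links = {("price", "battery")}

def satisfies(assignment, must_links, cannot_links):
    """assignment maps each word to a topic id; return True iff all
    must-links and cannot-links hold under this assignment."""
    for w1, w2 in must_links:
        if assignment[w1] != assignment[w2]:
            return False  # must-link violated: linked words got different topics
    for w1, w2 in cannot_links:
        if assignment[w1] == assignment[w2]:
            return False  # cannot-link violated: separated words share a topic
    return True

assignment = {"price": 0, "cost": 0, "battery": 1}
print(satisfies(assignment, must_links, cannot_links))  # True
```

Real KBTMs do not enforce these constraints as hard checks; they encode them as priors (e.g., the Dirichlet Forest prior in DF-LDA) that bias the sampler toward satisfying assignments.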

DF-LDA

DF-LDA is perhaps the earliest KBTM; it can incorporate two forms of prior knowledge from the user: must-links and cannot-links.

[Andrzejewski, David, Zhu, Xiaojin, and Craven, Mark. Incorporating domain knowledge into topic modeling via Dirichlet Forest priors. In ICML, pp. 25–32, 2009.]


DF-LDA [1]: A knowledge-based topic model that can use both must-links and cannot-links, but it assumes all the knowledge is correct.
MC-LDA [10]: A knowledge-based topic model that also uses both the must-link and the cannot-link knowledge. It likewise assumes that all knowledge is correct.
GK-LDA [9]: A knowledge-based topic model that uses the ratio of word probabilities under each topic to reduce the effect of wrong knowledge. However, it can only use the must-link type of knowledge.
LTM [7]: A lifelong learning topic model that learns the must-link type of knowledge automatically (and only that type). It outperformed [8].
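A common thread in these models is that knowledge softly biases the sampler rather than hard-constraining it. The sketch below is an assumed simplification (not the exact DF-LDA or LTM mechanism): when choosing a topic for a word, topics where its must-linked partners already have high probability are promoted multiplicatively. All word names and the `strength` parameter are hypothetical.

```python
import numpy as np

# P(word | topic) for 2 topics over 3 words (toy values).
phi = np.array([[0.6, 0.3, 0.1],   # topic 0
                [0.1, 0.2, 0.7]])  # topic 1
words = ["price", "cost", "battery"]
idx = {w: i for i, w in enumerate(words)}
must_links = {"price": ["cost"]}   # hypothetical must-link knowledge

def biased_topic_probs(word, phi, must_links, idx, strength=1.0):
    """Promote topics where the word's must-linked partners are probable."""
    p = phi[:, idx[word]].copy()
    for partner in must_links.get(word, []):
        p *= phi[:, idx[partner]] ** strength  # boost agreeing topics
    return p / p.sum()

print(biased_topic_probs("price", phi, must_links, idx))
```

With these toy values the bias pushes "price" strongly toward topic 0, where "cost" is also probable; setting `strength=0` recovers the plain per-word distribution.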

