Knowledge-based topic model (KBTM): an overview

Terminology

A must-link states that two words should belong to the same topic; a cannot-link states that two words should not belong to the same topic.
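Concretely, both kinds of knowledge can be thought of as sets of unordered word pairs. A minimal Python sketch (all names hypothetical, purely illustrative):

    # Minimal sketch (hypothetical representation): prior knowledge
    # stored as sets of unordered word pairs.
    must_links = {frozenset(p) for p in [("price", "cost"), ("battery", "power")]}
    cannot_links = {frozenset(p) for p in [("price", "battery")]}

    def same_topic_preferred(w1, w2):
        """True if the knowledge says w1 and w2 should share a topic."""
        return frozenset((w1, w2)) in must_links

    def same_topic_forbidden(w1, w2):
        """True if the knowledge says w1 and w2 must not share a topic."""
        return frozenset((w1, w2)) in cannot_links

    print(same_topic_preferred("price", "cost"))     # True
    print(same_topic_forbidden("battery", "price"))  # True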

DF-LDA

is perhaps the earliest KBTM, which can incorporate two forms of prior knowledge from the user: must-links and cannot-links.

[1] Andrzejewski, David, Zhu, Xiaojin, and Craven, Mark. Incorporating domain knowledge into topic modeling via Dirichlet Forest priors. In ICML, pp. 25–32, 2009.

DF-LDA [1]: A knowledge-based topic model that can use both must-links and cannot-links, but it assumes all the knowledge is correct.
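As a rough illustration of what enforcing cannot-links during inference could look like, the sketch below hard-masks violating topics in one step of a collapsed Gibbs sampler (all names hypothetical). DF-LDA itself does something more principled, encoding the constraints softly through a Dirichlet Forest prior rather than by masking:

    import numpy as np

    def constrained_topic_probs(p, word, topic_top_words, cannot_links):
        """Crude illustration: zero out topics whose current top words
        contain a cannot-linked partner of `word`, then renormalize.
        (DF-LDA instead uses a Dirichlet Forest prior, not masking.)"""
        p = np.asarray(p, dtype=float).copy()
        for k, top_words in enumerate(topic_top_words):
            if any(frozenset((word, w)) in cannot_links for w in top_words):
                p[k] = 0.0
        return p / p.sum()  # assumes at least one topic remains feasible

    # Toy example with 3 topics: topic 0 is masked for "price".
    probs = [0.5, 0.3, 0.2]
    tops = [["battery", "power"], ["price", "cost"], ["screen", "display"]]
    cl = {frozenset(("price", "battery"))}
    print(constrained_topic_probs(probs, "price", tops, cl))  # [0. 0.6 0.4]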

GK-LDA (Chen et al., 2013a)

A knowledge-based topic model that uses the ratio of word probabilities under each topic to reduce the effect of wrong knowledge. However, it can only use the must-link type of knowledge.
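A hedged sketch of the word-probability-ratio idea (illustrative only, not GK-LDA's exact mechanism; all names hypothetical): a must-link is trusted under a topic only to the extent that its two words have comparable probability under that topic's word distribution:

    import numpy as np

    def must_link_weight(phi_k, w1, w2, vocab, eps=1e-12):
        """Weight in (0, 1] for must-link (w1, w2) under topic k.
        A small ratio of the two word probabilities suggests the
        must-link is wrong knowledge for this topic."""
        p1 = phi_k[vocab[w1]] + eps
        p2 = phi_k[vocab[w2]] + eps
        return min(p1, p2) / max(p1, p2)

    vocab = {"price": 0, "cost": 1, "battery": 2}
    phi_k = np.array([0.30, 0.25, 0.001])  # topic k's word distribution
    print(must_link_weight(phi_k, "price", "cost", vocab))     # ~0.83: trusted
    print(must_link_weight(phi_k, "price", "battery", vocab))  # ~0.003: suspect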

Note: although both DF-LDA and GK-LDA can take prior knowledge from the user, neither can mine any prior knowledge automatically.

MC-LDA (Chen et al., 2013b)

is a recent knowledge-based model for aspect extraction.

A knowledge-based topic model that also uses both must-link and cannot-link knowledge. Like DF-LDA, it assumes that all the knowledge is correct.

AKL (Automated Knowledge LDA) (Chen et al., 2014: Aspect Extraction with Automated Prior Knowledge Learning)

A knowledge-based topic model that applies clustering to learn prior knowledge and uses that knowledge in the form of knowledge clusters. Its inference can exploit the automatically learned knowledge and handle incorrect knowledge, producing superior aspects.
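A minimal sketch of the knowledge-cluster idea (all names hypothetical; AKL's actual learning procedure differs): greedily group the top-word lists of topics discovered in past domains by word overlap, and treat each group as one knowledge cluster:

    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b)

    def knowledge_clusters(topics, threshold=0.3):
        """Greedy grouping of topic top-word lists; each resulting
        group plays the role of one knowledge cluster."""
        clusters = []
        for topic in topics:
            for cluster in clusters:
                if any(jaccard(topic, t) >= threshold for t in cluster):
                    cluster.append(topic)
                    break
            else:
                clusters.append([topic])
        return clusters

    past_topics = [
        ["price", "cost", "expensive"],
        ["cost", "price", "cheap"],
        ["battery", "power", "charge"],
    ]
    for c in knowledge_clusters(past_topics):
        print(c)  # two clusters: a price/cost cluster and a battery cluster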

LTM [7]:

A lifelong learning topic model that learns only the must-link type of knowledge automatically. It outperformed [8].
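A hedged sketch of the automatic must-link mining step (illustrative only; LTM's actual miner is a frequent-itemset procedure with further safeguards): word pairs that co-occur in the top words of enough previously learned topics are taken as must-links:

    from collections import Counter
    from itertools import combinations

    def mine_must_links(prior_topics, min_support=2):
        """Return word pairs appearing together in the top words of at
        least `min_support` topics learned from earlier tasks."""
        counts = Counter()
        for top_words in prior_topics:
            for pair in combinations(sorted(set(top_words)), 2):
                counts[pair] += 1
        return {pair for pair, c in counts.items() if c >= min_support}

    prior = [
        ["price", "cost", "expensive"],
        ["price", "cost", "cheap"],
        ["battery", "power"],
    ]
    print(mine_must_links(prior))  # {('cost', 'price')}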
