Efficient Learning for Undirected Topic Models

Jiatao Gu and Victor O.K. Li

The 53rd Annual Meeting of the Association for Computational Linguistics (ACL), Short Paper Track, 2015


Abstract
The Replicated Softmax model, a well-known undirected topic model, is powerful in extracting semantic representations of documents. Traditional learning strategies such as Contrastive Divergence are very inefficient. This paper provides a novel estimator to speed up the learning based on Noise Contrastive Estimation, extended for documents of varying lengths and weighted inputs. Experiments on two benchmarks show that the new estimator achieves great learning efficiency and high accuracy on document retrieval and classification.
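For intuition only, below is a minimal, generic sketch of how a Noise Contrastive Estimation objective can be set up for a Replicated Softmax model: the unnormalized model log-probability of a word-count vector is compared against a known noise distribution in a binary-classification loss. This is not the paper's exact estimator; the function names, the per-document learned log-normalizer `logZ`, and the parameter shapes are illustrative assumptions.

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x))
    return np.logaddexp(0.0, x)

def unnorm_logprob(v, W, b_vis, b_hid, logZ):
    """Unnormalized log-probability of a word-count vector v (shape (K,))
    under a Replicated Softmax RBM with weights W (K, J), visible biases
    b_vis (K,), hidden biases b_hid (J,). The document length D scales the
    hidden biases, as in Replicated Softmax. logZ is a learned log-normalizer
    (an assumption of this sketch, as commonly done in NCE)."""
    D = v.sum()
    return v @ b_vis + softplus(v @ W + D * b_hid).sum() - logZ

def nce_loss(v_data, noise_docs, log_pn_data, log_pn_noise, params):
    """Binary-classification NCE loss for one data document and k noise
    documents drawn from a noise distribution with known log-probs log_pn_*."""
    W, b_vis, b_hid, logZ = params
    k = len(noise_docs)
    # Log-odds that the data document came from the model rather than the noise
    s = unnorm_logprob(v_data, W, b_vis, b_hid, logZ) - (np.log(k) + log_pn_data)
    loss = softplus(-s)  # -log(sigmoid(s)): classify the data sample as "data"
    for v, log_pn in zip(noise_docs, log_pn_noise):
        s = unnorm_logprob(v, W, b_vis, b_hid, logZ) - (np.log(k) + log_pn)
        loss += softplus(s)  # -log(sigmoid(-s)): classify the noise sample as "noise"
    return loss
```

Minimizing this loss with respect to the model parameters avoids the MCMC sampling used by Contrastive Divergence, which is the source of the efficiency gain the abstract refers to.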

[paper] [code]

Please cite as: