2013-12-3
Continuous Space Language Model (NNLM)
- slides [1]
- related papers
- [1] Holger Schwenk. CSLM - A Modular Open-Source Continuous Space Language Modeling Toolkit. In Interspeech, August 2013.
- [2] Y. Bengio and R. Ducharme. A Neural Probabilistic Language Model. In Neural Information Processing Systems, volume 13, pages 932-938, 2001.
- [3] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient Estimation of Word
Representations in Vector Space. In Proceedings of Workshop at ICLR, 2013.
- [4] Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. In Proceedings of NIPS, 2013.
- [5] Tomas Mikolov, Wen-tau Yih, and Geoffrey Zweig. Linguistic Regularities in Continuous Space Word Representations. In Proceedings of NAACL HLT, 2013.
- [6] N-gram smoothing tutorial: http://nlp.stanford.edu/~wcmac/papers/20050421-smoothing-tutorial.pdf
- [7] Ronan Collobert, Jason Weston, Léon Bottou, Michael Karlen, Koray Kavukcuoglu, and Pavel Kuksa. Natural Language Processing (Almost) from Scratch. Journal of Machine Learning Research (JMLR), 12:2493-2537, 2011.
- [8] Andriy Mnih and Geoffrey Hinton. Three New Graphical Models for Statistical Language Modelling. In Proceedings of ICML, 2007.
- [9] Andriy Mnih and Geoffrey Hinton. A Scalable Hierarchical Distributed Language Model. In Proceedings of NIPS, pages 1081-1088, 2008.
- [10] Tomáš Mikolov. Statistical Language Models Based on Neural Networks. PhD thesis, Brno University of Technology, 2012.
- [11] Joseph Turian, Lev Ratinov, and Yoshua Bengio. Word Representations: A Simple and General Method for Semi-Supervised Learning. In Proceedings of the 48th Annual Meeting of the ACL, 2010.