Modeling order in neural word embeddings at scale

A. Trask, D. Gilmore, M. Russell. International Conference on Machine Learning (ICML), 2015. proceedings.mlr.press
Abstract
Natural Language Processing (NLP) systems commonly leverage bag-of-words co-occurrence techniques to capture semantic and syntactic word relationships. The resulting word-level distributed representations often ignore morphological information, even though character-level embeddings have proven valuable to NLP tasks. We propose a new neural language model that incorporates both word order and character order in its embedding. The model produces several vector spaces with meaningful substructure, as evidenced by its 85.8% accuracy on a recent word-analogy task, a 58% reduction in error over the best published syntactic word-analogy scores. Furthermore, the model includes several parallel training methods, most notably allowing a skip-gram network with 160 billion parameters to be trained overnight on 3 multi-core CPUs, 14x larger than the previous largest neural network.
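The abstract states that the model encodes word order in a skip-gram-style embedding but does not specify the architecture. The sketch below is one minimal, illustrative way to make a skip-gram with negative sampling order-aware: it gives each context offset its own output embedding matrix, so that a word one position to the left is modeled differently from a word one position to the right. The vocabulary size, dimensions, and the `train_pair` helper are hypothetical choices for illustration, not the paper's published method.

```python
# A minimal sketch of an order-aware skip-gram step with negative sampling.
# The position-specific output matrices here are an illustrative assumption;
# the paper's actual architecture is not described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

V, D, WIN = 10_000, 64, 2          # vocab size, embedding dim, window radius
W_in = rng.normal(0, 0.1, (V, D))  # input (word) embeddings
# One output matrix per context offset (-WIN..-1, +1..+WIN), so the model
# can distinguish "dog bites man" from "man bites dog".
offsets = [o for o in range(-WIN, WIN + 1) if o != 0]
W_out = {o: rng.normal(0, 0.1, (V, D)) for o in offsets}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_pair(center, context, offset, lr=0.025, k_neg=5):
    """One SGD step on (center word, context word at a given signed offset)."""
    v = W_in[center]
    # One positive sample plus k negatives drawn uniformly (a full
    # implementation would sample from the unigram^0.75 noise distribution).
    samples = [(context, 1.0)] + [(int(rng.integers(V)), 0.0)
                                  for _ in range(k_neg)]
    grad_v = np.zeros(D)
    for w, label in samples:
        u = W_out[offset][w]
        g = (sigmoid(v @ u) - label) * lr   # gradient of logistic loss
        grad_v += g * u
        W_out[offset][w] -= g * v
    W_in[center] -= grad_v

# Example: the window around "bites" in "dog bites man",
# with hypothetical word ids dog=3, bites=7, man=9.
train_pair(center=7, context=3, offset=-1)
train_pair(center=7, context=9, offset=+1)
```

Because each offset has its own output matrix, the parameter count grows by a factor of 2*WIN over plain skip-gram, which is consistent with the abstract's emphasis on scale and parallel training across CPUs.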