mrs2vec: Word Embeddings with Semantic Contexts

Published in UW Linguistics, 2017

Abstract:

Mapping words to real-valued vectors for use in NLU applications has attracted renewed interest in recent years due to advances in neural techniques. Previous work has incorporated forms of syntactic and semantic knowledge into the word-embedding process with positive results, but none has leveraged a syntactic resource as precise as the English Resource Grammar (ERG). Moreover, an ERG parse of a sentence contains a semantic graph, derived from the syntax, that expresses compositional semantic relations among the denoted entities and events. This paper outlines mrs2vec, a method for training word embeddings from the semantic dependencies given by the ERG. I show how these embeddings compare to state-of-the-art embeddings that incorporate syntactic and non-compositional lexical semantic knowledge.
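To make the idea of semantic-dependency contexts concrete, below is a minimal sketch of skip-gram training over relation-labeled dependency pairs, in the style of Levy and Goldberg (2014). The toy triples, the ARG1/ARG1-inv labeling scheme, and all hyperparameters here are illustrative assumptions, not the paper's actual data or pipeline; in mrs2vec the dependencies would come from ERG/MRS parses.

```python
# Sketch: dependency-based skip-gram with negative sampling.
# All data and hyperparameters below are hypothetical placeholders.
import numpy as np

# Toy semantic dependencies: (head, relation, dependent).
triples = [
    ("chase", "ARG1", "dog"),
    ("chase", "ARG1", "cat"),
    ("sleep", "ARG1", "cat"),
    ("sleep", "ARG1", "dog"),
    ("bark",  "ARG1", "dog"),
]

# Build (target, context) pairs: each head sees a relation-labeled
# dependent, and each dependent sees an inverse-labeled head.
pairs = []
for head, rel, dep in triples:
    pairs.append((head, f"{rel}_{dep}"))
    pairs.append((dep, f"{rel}-inv_{head}"))

words = sorted({w for w, _ in pairs})
ctxs = sorted({c for _, c in pairs})
w_idx = {w: i for i, w in enumerate(words)}
c_idx = {c: i for i, c in enumerate(ctxs)}

rng = np.random.default_rng(0)
dim, lr, negatives, epochs = 25, 0.05, 3, 200
W = rng.normal(scale=0.1, size=(len(words), dim))  # word vectors
C = rng.normal(scale=0.1, size=(len(ctxs), dim))   # context vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Skip-gram with negative sampling over the dependency pairs.
# (Negatives are drawn uniformly and may occasionally collide with
# the positive context; acceptable for a toy sketch.)
for _ in range(epochs):
    for word, ctx in pairs:
        wi, ci = w_idx[word], c_idx[ctx]
        samples = [(ci, 1.0)] + [
            (rng.integers(len(ctxs)), 0.0) for _ in range(negatives)
        ]
        grad_w = np.zeros(dim)
        for cj, label in samples:
            g = (sigmoid(W[wi] @ C[cj]) - label) * lr
            grad_w += g * C[cj]
            C[cj] -= g * W[wi]
        W[wi] -= grad_w

# dog and cat share contexts (both are ARG1 of chase and sleep),
# so their vectors should tend to end up close together.
def cos(a, b):
    va, vb = W[w_idx[a]], W[w_idx[b]]
    return (va @ vb) / (np.linalg.norm(va) * np.linalg.norm(vb))

print("dog~cat:", round(cos("dog", "cat"), 3))
```

The key design choice this sketch mirrors is that contexts are labeled with the semantic relation rather than drawn from a linear window, so words that fill the same argument roles become similar even when they never co-occur nearby in text.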

PDF

BibTeX:

@article{koncelmrs2vec,
  title={mrs2vec: Word Embeddings with Semantic Contexts},
  author={Koncel-Kedziorski, Rik},
  year={2017}
}