Dependency-Based Word Embeddings

Tags: deep-learning, natural-language-processing
Date: Oct 24, 2018

WHY?

Traditional continuous word embeddings are based on linear contexts: a word's context is just the words in a fixed-size window around it.
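A minimal sketch of what "linear context" means. The helper below is hypothetical (not from the paper): for a window size k, every word within k positions of the target counts as one of its contexts.

```python
def window_contexts(tokens, k=2):
    """BoW-k contexts: pair each token with the k words on either side."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - k), min(len(tokens), i + k + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

tokens = "the scientist discovers a star".split()
pairs = window_contexts(tokens, k=2)
# "discovers" is paired with "the", "scientist", "a", "star"
```

Note that the window is purely positional: a syntactically irrelevant neighbor and a direct argument of the verb are treated identically, which is exactly what the dependency-based variant changes.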

WHAT?

This paper introduces dependency-based word embeddings to capture more meaningful contexts. Instead of the surrounding words, the contexts are taken from a dependency parse of the sentence: each word's contexts are the words it is syntactically connected to, labeled with the dependency relation.
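A minimal sketch of dependency-context extraction in the style of the paper. This is a hypothetical illustration, not the authors' code: the parse format `(word, head_index, relation)` is an assumption, and a real pipeline would get the parse from an actual parser. Each word is paired with its syntactic neighbors joined with the relation label (inverse relations marked `-1`), and prepositions are "collapsed" so the head connects directly to the preposition's object.

```python
def dependency_contexts(parse):
    """parse: list of (word, head_idx, rel); head_idx == -1 marks the root.
    Returns (target, context) pairs, where a context is a neighboring word
    joined with the dependency relation (inverse direction marked -1)."""
    pairs = []
    for i, (word, head, rel) in enumerate(parse):
        if head == -1:
            continue
        head_word, _, head_rel = parse[head]
        if rel == "prep":
            # collapse: fold the preposition into the relation label,
            # linking the head directly to the preposition's object
            for w2, h2, r2 in parse:
                if h2 == i and r2 == "pobj":
                    pairs.append((head_word, f"prep_{word}/{w2}"))
                    pairs.append((w2, f"prep_{word}-1/{head_word}"))
            continue
        if rel == "pobj" and head_rel == "prep":
            continue  # already covered by the collapsing step above
        pairs.append((head_word, f"{rel}/{word}"))
        pairs.append((word, f"{rel}-1/{head_word}"))
    return pairs

# "scientist discovers star with telescope" (hand-written parse for illustration)
parse = [
    ("scientist", 1, "nsubj"),
    ("discovers", -1, "root"),
    ("star", 1, "dobj"),
    ("with", 1, "prep"),
    ("telescope", 3, "pobj"),
]
pairs = dependency_contexts(parse)
# "discovers" gets the contexts nsubj/scientist, dobj/star, prep_with/telescope
```

The resulting (target, context) pairs can then be fed to a skip-gram model in place of window-based pairs, which is how the paper trains its embeddings.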

SO?

While the previous BoW contexts reflect the domain of the target word, dependency-based contexts capture its semantic type. In other words, BoW finds words that associate with w, while Deps finds words that behave like w: for instance, the paper's nearest-neighbor comparison shows that BoW neighbors of a word tend to be topically related terms, whereas Deps neighbors are functional substitutes of the same type.

Critique