
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
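To make the mapping concrete, here is a minimal illustrative sketch in Python. The vocabulary, the 4-dimensional vectors, and their values are invented for the example; real embeddings are learned from data and typically have 50 to 300 or more dimensions.

```python
import numpy as np

# A toy embedding table: each word in the vocabulary maps to a
# dense vector of real numbers. The vectors below are made up
# purely for illustration, not learned from any corpus.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.12]),
    "queen": np.array([0.54, 0.71, -0.55, 0.80]),
    "apple": np.array([-0.12, 0.03, 0.91, -0.44]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; closer to 1 means more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# After training, semantically related words should have more
# similar vectors than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```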

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an auxiliary NLP task such as language modeling or document classification.
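As a sketch of how such embeddings are trained in practice, the snippet below uses the gensim library's Word2Vec implementation (gensim 4.x API). The corpus and hyperparameters are placeholder assumptions for the example, not settings from any paper listed here.

```python
from gensim.models import Word2Vec

# Tiny placeholder corpus: a list of tokenized sentences.
# In practice, Word2Vec is trained on millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram model (sg=1); vector_size sets the
# dimensionality of the learned embeddings.
model = Word2Vec(
    sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50
)

# Look up a learned vector and query nearest neighbours.
vec = model.wv["embeddings"]                 # a 50-dimensional numpy array
print(model.wv.most_similar("words", topn=3))
```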

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3401-3410 of 4002 papers

Title | Status | Hype
Recent Developments within BulTreeBank | - | 0
Dual Embeddings and Metrics for Relational Similarity | - | 0
Incorporating visual features into word embeddings: A bimodal autoencoder-based approach | - | 0
Unsupervised Induction of Compositional Types for English Adjective-Noun Pairs | - | 0
Learning to Compose Spatial Relations with Grounded Neural Language Models | - | 0
Distributional Lesk: Effective Knowledge-Based Word Sense Disambiguation | - | 0
Sense Embeddings in Knowledge-Based Word Sense Disambiguation | Code | 0
Distributional regularities of verbs and verbal adjectives: Treebank evidence and broader implications | - | 0
Using Neural Word Embeddings in the Analysis of the Clinical Semantic Verbal Fluency Task | - | 0
Neural Disambiguation of Causal Lexical Markers Based on Context | Code | 0
Page 341 of 401

No leaderboard results yet.