
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
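
At its core, each word is an index into a matrix of real-valued vectors. Below is a minimal sketch of that lookup, assuming NumPy; the vocabulary, dimensionality, and vector values are all illustrative, not from any trained model:

```python
import numpy as np

# Toy vocabulary and a |V| x d embedding matrix; a real model learns these values.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embeddings = np.random.default_rng(0).normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    # Embedding lookup is just selecting a row of the matrix.
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    # Cosine similarity is the usual way to compare word vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))
```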

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, typically neural network-based, that train on an NLP task such as language modeling or document classification.
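
As a concrete example, here is a hedged sketch of learning Word2Vec embeddings with the gensim library (assumed installed via `pip install gensim`); the toy corpus and all hyperparameter values are illustrative choices, not prescribed settings:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# Skip-gram (sg=1); small dimensionality and window to suit the toy corpus.
model = Word2Vec(corpus, vector_size=32, window=2, min_count=1, sg=1, epochs=50, seed=0)

vector = model.wv["king"]                     # learned vector for "king"
print(model.wv.most_similar("king", topn=2))  # nearest neighbors in embedding space
```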

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1411–1420 of 4002

Title | Status | Hype
Contextual and Non-Contextual Word Embeddings: an in-depth Linguistic Investigation | – | 0
Metaphor Detection Using Contextual Word Embeddings From Transformers | – | 0
Visual Question Generation from Radiology Images | Code | 1
Evaluating Natural Alpha Embeddings on Intrinsic and Extrinsic Tasks | – | 0
Character aware models with similarity learning for metaphor detection | – | 0
Neural Metaphor Detection with a Residual biLSTM-CRF Model | – | 0
Token Level Identification of Multiword Expressions Using Contextual Information | – | 0
Getting the ##life out of living: How Adequate Are Word-Pieces for Modelling Complex Morphology? | – | 0
LIT Team's System Description for Japanese-Chinese Machine Translation Task in IWSLT 2020 | – | 0
Word Embeddings as Tuples of Feature Probabilities | – | 0

Leaderboards

No leaderboard results yet.