
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
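As a minimal sketch of the word-to-vector mapping described above, the Python snippet below trains a small Word2Vec model with the gensim library (gensim >= 4.0 API). The toy corpus, hyperparameter values, and query words are illustrative assumptions, not taken from any paper listed on this page.

from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued word vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy vocabulary
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,       # many passes, since the corpus is tiny
    seed=42,
)

# Each vocabulary word is now mapped to a vector of real numbers.
vec = model.wv["king"]
print(vec.shape)  # (50,)

# Similarity between embeddings reflects distributional similarity.
print(model.wv.similarity("king", "queen"))

With a real corpus, nearest-neighbour queries such as model.wv.most_similar("king") surface distributionally similar words; on a toy corpus this size, the vectors only demonstrate the mapping itself.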

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3871–3880 of 4002 papers

Title | Status | Hype
CARER: Contextualized Affect Representations for Emotion Recognition | Code | 0
AWE-CM Vectors: Augmenting Word Embeddings with a Clinical Metathesaurus | Code | 0
MICE: Mining Idioms with Contextual Embeddings | Code | 0
Think Globally, Embed Locally --- Locally Linear Meta-embedding of Words | Code | 0
Mimicking Word Embeddings using Subword RNNs | Code | 0
Guided Open Vocabulary Image Captioning with Constrained Beam Search | Code | 0
Mind Your Bias: A Critical Review of Bias Detection Methods for Contextual Language Models | Code | 0
Solving ARC visual analogies with neural embeddings and vector arithmetic: A generalized method | Code | 0
Query Focused Multi-document Summarisation of Biomedical Texts | Code | 0
Query Focused Multi-document Summarisation of Biomedical Texts: Macquarie University and the Australian National University at BioASQ8b | Code | 0
Page 388 of 401

No leaderboard results yet.