
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
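As a concrete illustration of the mapping from words to real-valued vectors, the following is a minimal sketch of training a Word2Vec model with the gensim library (assuming the gensim 4.x API); the toy corpus is illustrative only, and real training data would be far larger.

# Minimal Word2Vec training sketch (gensim 4.x API assumed).
from gensim.models import Word2Vec

# Each document is a list of tokens; a real corpus would be far larger.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
]

# Train a skip-gram model mapping each word to a 50-dimensional vector.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

# Look up the learned vector for a word and its nearest neighbours.
vector = model.wv["king"]                    # numpy array of shape (50,)
print(model.wv.most_similar("king", topn=3))

Once trained, model.wv holds one dense vector per vocabulary word, and similarity queries operate on those vectors, which is exactly the word-to-vector mapping described above.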

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 431–440 of 4,002 papers (page 44 of 401)

Title | Status | Hype
Unsupervised Lexical Substitution with Decontextualised Embeddings | Code | 0
Integrating Form and Meaning: A Multi-Task Learning Model for Acoustic Word Embeddings | Code | 0
Evaluation of Question Answering Systems: Complexity of judging a natural language | | 0
Visual Grounding of Inter-lingual Word-Embeddings | | 0
User recommendation system based on MIND dataset | | 0
Layer or Representation Space: What makes BERT-based Evaluation Metrics Robust? | Code | 0
Knowledge-aware attentional neural network for review-based movie recommendation with explanations | Code | 0
Improving Translation of Out Of Vocabulary Words using Bilingual Lexicon Induction in Low-Resource Machine Translation | | 0
Gender bias Evaluation in Luganda-English Machine Translation | | 0
Do gender neutral affixes naturally reduce gender bias in static word embeddings? | | 0

No leaderboard results yet.