Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
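
As a minimal sketch of the idea (the vocabulary, dimensionality, and random vectors below are invented for illustration), an embedding can be viewed as a lookup table from words to dense real-valued vectors, with relatedness measured geometrically:

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued
# vector. Real models use 50-1000+ dimensions learned from data; these
# vectors are random placeholders.
rng = np.random.default_rng(0)
vocab = ["king", "queen", "apple", "banana"]
embeddings = {word: rng.normal(size=4) for word in vocab}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity of two embedding vectors, in [-1, 1]."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# After training on an NLP task, semantically related words end up with
# nearby vectors; with random placeholders this value is arbitrary.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```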

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
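
For example, the sketch below trains a small skip-gram Word2Vec model with the gensim library; the toy corpus and hyperparameter values are placeholders chosen for illustration, not a reference configuration:

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a skip-gram Word2Vec model (gensim 4.x API).
model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=3,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["embeddings"]                     # learned 50-dim vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbours
```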

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 911–920 of 4002 papers

Title | Status | Hype
Definition Frames: Using Definitions for Hybrid Concept Representations | Code | 0
CoSimLex: A Resource for Evaluating Graded Word Similarity in Context | Code | 0
Projective Methods for Mitigating Gender Bias in Pre-trained Language Models | Code | 0
ProMap: Effective Bilingual Lexicon Induction via Language Model Prompting | Code | 0
A Bi-Encoder LSTM Model For Learning Unstructured Dialogs | Code | 0
Definition Modeling: Learning to define word embeddings in natural language | Code | 0
An Interpretable and Uncertainty Aware Multi-Task Framework for Multi-Aspect Sentiment Analysis | Code | 0
Creative Contextual Dialog Adaptation in an Open World RPG | Code | 0
Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation | Code | 0
Deep word embeddings for visual speech recognition | Code | 0

No leaderboard results yet.