
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
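As a minimal sketch of what this mapping looks like (the 3-dimensional vectors below are hand-picked for illustration, not learned; real embeddings typically have hundreds of dimensions):

```python
import numpy as np

# Each vocabulary item is mapped to a dense vector of real numbers.
# These toy vectors are hypothetical, chosen only to illustrate the idea.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine_similarity(u, v):
    """Standard cosine similarity between two vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Semantically related words end up with nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```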

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
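For instance, a Word2Vec model can be trained in a few lines with the gensim library (a sketch assuming gensim ≥ 4.0; the toy corpus and hyperparameters here are illustrative only):

```python
from gensim.models import Word2Vec

# A tiny toy corpus; real training uses large tokenized corpora.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# Train a skip-gram Word2Vec model (sg=1) with 50-dimensional vectors.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and query its nearest neighbours.
vec = model.wv["embeddings"]          # a 50-dimensional numpy array
print(model.wv.most_similar("word"))  # words with the closest vectors
```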

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 931–940 of 4002 papers

| Title | Status | Hype |
|---|---|---|
| The Effect of Pretraining on Extractive Summarization for Scientific Documents | | 0 |
| Measuring Biases of Word Embeddings: What Similarity Measures and Descriptive Statistics to Use? | | 0 |
| Profiling of Intertextuality in Latin Literature Using Word Embeddings | Code | 0 |
| Query2Prod2Vec: Grounded Word Embeddings for eCommerce | Code | 1 |
| Morphology-Aware Meta-Embeddings for Tamil | Code | 0 |
| Cross-Lingual Word Embedding Refinement by ℓ1 Norm Optimisation | Code | 1 |
| Field Embedding: A Unified Grain-Based Framework for Word Representation | | 0 |
| Data Filtering using Cross-Lingual Word Embeddings | | 0 |
| Gender Bias Hidden Behind Chinese Word Embeddings: The Case of Chinese Adjectives | | 0 |
| LenAtten: An Effective Length Controlling Unit For Text Summarization | Code | 0 |
Page 94 of 401

Leaderboards

No leaderboard results yet.