
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
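
Concretely, an embedding is a table lookup: a matrix with one real-valued row per vocabulary word. The sketch below is a minimal illustration in plain NumPy, with a hypothetical toy vocabulary and randomly initialized vectors; trained embeddings would instead place semantically related words near each other.

```python
import numpy as np

# Hypothetical toy vocabulary; real models index tens of thousands of words.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embedding_dim = 4

# One real-valued row per word. Random here purely for illustration;
# training is what gives the rows their semantic structure.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers via table lookup."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional real-valued vector
```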

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
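
As a minimal training sketch, assuming the gensim library (whose gensim.models.Word2Vec class implements the Word2Vec algorithm; parameter names follow gensim 4.x) and a hypothetical toy corpus:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences. Real training uses far
# larger text collections for the vectors to capture useful regularities.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks"],
    ["the", "woman", "walks"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]                # the learned embedding vector
similar = model.wv.most_similar("king")  # nearest words by cosine similarity
```

On a realistic corpus, most_similar surfaces semantically related words; on a toy corpus this small, the neighbors are not meaningful.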

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2141–2150 of 4002 papers (page 215 of 401)

Title (Hype)
Enriching Phrase Tables for Statistical Machine Translation Using Mixed Embeddings (0)
Enriching Word Embeddings with Domain Knowledge for Readability Assessment (0)
Ensemble Methods to Distinguish Mainland and Taiwan Chinese (0)
Ensemble Romanian Dependency Parsing with Neural Networks (0)
Entity-Aware Dependency-Based Deep Graph Attention Network for Comparative Preference Classification (0)
Entity-Centric Contextual Affective Analysis (0)
Entity Linking for Queries by Searching Wikipedia Sentences (0)
Entropy-Based Subword Mining with an Application to Word Embeddings (0)
Equalizing Gender Bias in Neural Machine Translation with Word Embeddings Techniques (0)
Equipping Educational Applications with Domain Knowledge (0)

Leaderboards

No leaderboard results yet.